It has apparently been discovered that the BBC has been paying Google to place its website at the top of the listings for a series of keywords, as part of its internet marketing plan.
With a guaranteed £3.5 billion in revenue each year, the BBC obviously holds a strong position in the market, leaving its competitors struggling. It is said that the BBC has been paying Google – as part of a £100m marketing spend – to improve its search ranking. The corporation itself recognised the importance of high search rankings and stated:
‘Promoting content like the Mercury Prize online is an effective way to inform the licence fee payers who will want to watch it or read about it.’
The BBC sees a huge opportunity in capitalising on Search and making sure it is achieving top organic search results.
One of the things most search marketers will (and should) tell their clients is that a top organic result in Google cannot be bought, which makes the news quite surprising. There are rumours that in some cases deals can be made with Google, but there hasn’t been any real proof. The BBC has had a deal with Google before, when it put content on YouTube.
Still, if the rumour were true, it would be a big thing. It could, however, very well be another case of a reporter not knowing the difference between an organic and a paid result.
The battle between Bing and Google has heated up with both sides agreeing to deals with micro-blogging site Twitter. In addition, Microsoft has reached a separate agreement with Facebook, while Google is launching its own, unique search tool for social networking sites.
User demand is behind decisions by Microsoft and Google to include social networking in search results. While both search sites update their index of web pages regularly, they still struggle to cope with very recent information such as current events. While both Google and Bing have dedicated searches of news websites, that doesn’t cover comments and reports by non-journalists, including those on hand during a major event — information which is available through social networks.
Twin Tie-Ups For Twitter
Twitter appears to have pulled off a smart marketing move by having deals with both search giants announced within hours of one another. Bing has already released a beta edition of its Twitter search which, unlike the facility on Twitter’s own site, includes a list of the web pages which receive the most links in Twitter posts. That’s a useful way of finding the latest talking points.
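The idea behind Bing’s “most-linked pages” feature can be illustrated with a minimal sketch – my own toy example, not Bing’s implementation – that pulls URLs out of a batch of tweets and ranks them by frequency:

```python
import re
from collections import Counter

def top_linked_pages(tweets, n=3):
    """Toy version of a 'most-linked pages' list: extract URLs from
    tweet text and rank them by how often they appear."""
    url_pattern = re.compile(r"https?://\S+")
    counts = Counter(url for t in tweets for url in url_pattern.findall(t))
    return counts.most_common(n)

# Hypothetical sample tweets for illustration.
tweets = [
    "Big news! http://example.com/story",
    "Reading http://example.com/story right now",
    "Totally unrelated http://example.org/other",
]
print(top_linked_pages(tweets))
# The story linked twice ranks above the page linked once.
```

A real system would also need to resolve shortened URLs and filter spam, but the core of “the latest talking points” is just this kind of frequency count over a recent window of posts.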
Google Wave was unleashed for public testing recently, with 100,000 invites being sent out – and with each of those, a further batch of invites for recipients to pass on (think Gmail at its inception).
Catch the Wave – it’s shaping up to be the most interactive social media collaboration idea out there. Email, instant messaging, and documents that can be edited by a list of people you select: it’s a step towards making the internet just a little more accessible. Google has a love affair with making information freely available, from its book digitizing (currently on hold), to the search engine itself and its indexing of the majority of the internet, to handing out free services like Gmail, and now the Wave service.
Wave itself is still a good few months off yet – admittedly still buggy, by Google’s own blog posts – but when the Wave gets rolling, it would be best to be on it and ready for the ride.
Step 1: Don’t worry. This occurs regularly, and can cause a lot of movement in rankings, meaning that it’s come to be feared by many in the SEO industry and anticipated by others. The update isn’t just one sudden switch, though, as each index update takes several days to complete. During this update the searches seem to ‘dance’ between the old index and new index – that’s the Google dance.
So why does it happen? Well, Google pulls its results from thousands of servers, and they can’t all be updated at once. Instead, each server is updated with the new index, one at a time. This can cause very strange behavior in the ranking process if two major sites located on separate servers happen to have a close linking bond. These sorts of separations are interesting and can contribute to a great deal of movement in rankings. The most important thing to keep in mind is that eventually Google will rank you where you belong. Generally, if you behave, you will not be thrown around for long by the odd activity that can occur while Google is updating its index.
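The dance can be modelled with a toy sketch – my own illustration, in no way Google’s actual architecture – in which a fleet of servers is updated one at a time while each query lands on an arbitrary server:

```python
import random

# Toy model of a rolling index update: some servers still serve the
# old index while others have the new one, so the same query can
# return different rankings from moment to moment.
old_index = {"example.com": 3}   # rank under the old index
new_index = {"example.com": 1}   # rank under the new index

servers = [old_index] * 10       # ten servers, all on the old index

def query(site):
    # Each search hits an arbitrary server in the fleet.
    return random.choice(servers)[site]

# Roll the update out one server at a time.
for i in range(len(servers)):
    servers[i] = new_index
    ranks = {query("example.com") for _ in range(50)}
    print(f"servers updated: {i + 1}, ranks seen: {sorted(ranks)}")
```

Mid-rollout, repeated queries can see either rank – that wobble is the dance. Once every server carries the new index, the ranking settles.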
One common misunderstanding is the idea that Google controls which server each kind of information is coming from, and so stores similar information on the same server. Google’s index doesn’t work this way – it’s a big, disorganized mass of information that Google searches very quickly. This is a blessing in disguise because it allows your site to remain reachable via other sites that are related to it while the update is taking place. Your site generally won’t suffer for too long when an update is happening anyway, but if you are heavily dependent on Google results, you may see a slight drop for a short period of time. This drop is often followed by a slight spike, especially if your ranking has increased since the last round of indexing.
The servers that Google uses are distributed between datacenters all over the world. Google doesn’t keep all of those eggs in one basket – they want to be able to lose one datacenter and have the rest survive. This way, if part of Google goes down, people can still use the search engine, and, as was said before, your site can still be accessed via related sites if the server holding your site’s index happens to go down. The datacenters that Google has put into play are enormous in comparison to most datacenters around the world. Each of Google’s datacenters rivals some of the largest in the world, and combined they would probably form the largest of all.
You see, the ‘time-to-live’ on Google’s DNS records is only five minutes – meaning Google can change its IP address every five minutes. This allows it to switch between its datacenters regularly, spreading the search load between them intelligently and routing around any damage. If every search you placed constantly hit the same datacenter, it would almost certainly fry within twenty-four hours. Considering the number of users on Google each and every day, it is surprising that the servers they do have are enough. A server can only handle so much traffic in a day, and Google ensures that it can handle more than any other service on the internet.
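The mechanics of a short TTL can be shown with a simplified sketch of DNS caching – a toy model of the general technique, not Google’s real infrastructure. A resolver honouring a five-minute TTL only asks for a fresh address once the cached record expires, which is what lets the authoritative side steer traffic to a different datacenter every few minutes:

```python
import time

TTL_SECONDS = 300  # five minutes, as described above

class CachingResolver:
    """Minimal DNS-style cache: re-resolves a name once its TTL expires."""

    def __init__(self, lookup, now=time.monotonic):
        self.lookup = lookup   # function returning the current IP for a name
        self.now = now         # injectable clock, handy for testing
        self.cache = {}        # name -> (ip, expiry_time)

    def resolve(self, name):
        entry = self.cache.get(name)
        if entry and self.now() < entry[1]:
            return entry[0]                     # still fresh: reuse cached IP
        ip = self.lookup(name)                  # expired or absent: ask again
        self.cache[name] = (ip, self.now() + TTL_SECONDS)
        return ip
```

Within any five-minute window, clients keep hitting the cached address; as soon as the record expires, the next lookup can hand back a different datacenter’s IP.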
The datacenters updating their indexes at different times is what causes Google to do its dance. Unless you’re looking for your website’s ranking, you’d never notice this, as your site is normally available at all times. The unfortunate bit is that oftentimes you will lose your ranking for a short period of time, or your site will seem to have a lower number of pages indexed by Google. If you ensure that you have several hundred pages available on Google at all times, you will most likely be able to provide all of your content at all times, either directly or indirectly.
Step 2: There is no step 2
With its sights firmly set on being the king of search, Google is constantly growing and evolving. It’s a living, breathing, life-sustaining organ of the web, and as such, any moment of service problems is almost immediately noticeable.
Gmail, Google News, Blogger, YouTube, etc. – the list of companies under Google’s umbrella and in its repertoire is quite large, and with acquisitions once again on the table, it is soon to grow. With such a huge toolkit of technologies available to them, it should be understandable that the odd disruption of service happens, even though it is rare.
And now, with Google encouraging users to move their productivity apps from the desktop to the “cloud”, loss of service is beginning to become an issue – not because it happens every day, or even once a month. The simple idea behind cloud computing – having “your computer” available to you anywhere – is an incredible incentive. But if, when you go to use the cloud, you can’t access it because of a glitch, a programming error, or someone tripping on a plug, it is a problem.
It’s like showing up to your workplace and your computer just not booting up. The tech manager is in the next city over, trying to explain to you what’s wrong over the phone, but because he’s using sign language you can’t tell what’s going on or when it will be fixed. All you can do is wait.
Cloud computing may very well end up being the greatest boon to business productivity the world has seen to date. But as of right now, it’s still a brand new technology and, as such, will encounter hiccups, glitches, crashes, and downtime. Should Google be knocked, stripped, and beaten down for it? Not in my opinion, but everyone has their own.
For everything that Google does impressively, how easy it is to forget that they are stepping into a previously unknown sector.
In its bid to digitize the world, Google has moved forward again in its application to digitize the world’s libraries. Lately, however, things have taken a decidedly more serious turn, with the Department of Justice weighing in with its concerns. The DoJ became the latest party to file concerns about Google’s book settlement, and it appears the search giant will have to either make tweaks to the deal or allow the feds – and maybe even Congress – to poke around. You should be betting on the tweaks.
There were some good points made by the DoJ:
The United States strongly supports a vibrant marketplace for the electronic distribution of copyrighted works, including in-print, out-of-print, and so-called “orphan” works. The Proposed Settlement has the potential to breathe life into millions of works that are now effectively off limits to the public. By allowing users to search the text of millions of books at no cost, the Proposed Settlement would open the door to new research opportunities. Users with print disabilities would also benefit from the accessibility elements of the Proposed Settlement, and, if the Proposed Settlement were approved, full text access to tens of millions of books would be provided through institutional subscriptions.
They also made it very clear that there are valid concerns about the settlement as it exists now.
…the breadth of the Proposed Settlement – especially the forward-looking business arrangements it seeks to create – raises significant legal concerns. As a threshold matter, the central difficulty that the Proposed Settlement seeks to overcome – the inaccessibility of many works due to the lack of clarity about copyright ownership and copyright status – is a matter of public, not merely private, concern. A global disposition of the rights to millions of copyrighted works is typically the kind of policy change implemented through legislation, not through a private judicial settlement.
In the end, Google is most likely going to make a few tweaks to its compensation agreement and try to placate everyone as best as possible.
Microsoft Bing has done rather well since it launched. But there is much more to come, with the site set to evolve and expand in the coming months. In fact, Bing 2.0 could be just around the corner. Meanwhile, Microsoft has had to deny it has an obsession with porn after an advert for Bing was discovered on Google alongside search results for “pornography.”
Bing is truly managing to do something I never thought I’d see – it’s weaning people off an over-reliance on Google. I admit, I’d got to the point where I used Google automatically, never even giving another search (or decision) engine a thought. But then Microsoft launched Bing in May and I saw there was a viable alternative out there. I now use both on a regular basis and am happy the search giant has some big-name competition at last.
It isn’t perfect, however, with some features needing to be tweaked and some obvious areas ripe for improvement. And that’s exactly what Microsoft is planning to do, sooner rather than later.
Monte Enbysk, senior editor at Microsoft Office Live, wrote, “Bing 2.0, out this month, has some exciting new features. Imagine seeing maps plus pics from the neighborhood of a restaurant to try.”
As usual, Microsoft has played dumb over the speculation, stating: “We’re very excited about some of the new Bing features set to roll out over the next few months, but have nothing to announce today.” But a September release seems assured, with some indicating a launch this coming week. I suppose we’ll just have to wait and see, and then hope the improvements are noticeable.
Less savory is the discovery of an advert for Bing on Google when “pornography” is searched for. TechCrunch made the discovery, so I have to assume one of their staffers likes perusing NSFW content while at work, and also doesn’t know that “porn” is now used instead of “pornography” by all but the upper classes. Probably.
Microsoft denied purchasing ad placement on searches of this kind and concluded that “free videos” is more likely to have triggered the ad’s appearance. Which may well be the case. But that doesn’t change the fact that Bing is widely regarded as the search engine to use to find porn. Microsoft might not like that reputation, but I’m sure the traffic that comes its way as a result isn’t unwanted.
Aside from CEO Steve Ballmer scolding a Microsoft employee for flaunting an iPhone, Bing 2.0 was the biggest news to leak from a private company meeting on Thursday. Yes, it appears that the software giant is about ready to relaunch its search engine and great Google killer, according to a burst of unconfirmed employee tweets.
But, whether a few new bells and whistles will move the needle for Bing is hardly certain. Despite millions upon millions in marketing dollars, the search engine still trails far behind Google.
Net Applications estimated that Google held 81.22% of search engine market share in June, followed by Yahoo at 9.21%, Microsoft’s Bing at 5.31% and MSN Live at 0.66%. Hitwise, meanwhile, found that Bing’s market share – including MSN Search and Live.com – was just 5.25% in June.
The search engine titan’s stock has more than quintupled since it went public in August 2004. But will Google’s next five years be as successful as the past five?
Can you believe that it’s already been five years since Google went public?
The search engine giant debuted on Aug. 19, 2004 at $85 a share. Today, the stock trades at about $445. That’s a nearly 420% return during a time when the Nasdaq is up only 8%. And shares of top rival Yahoo! have been nearly cut in half during the past five years.
Yet, it doesn’t look like all those Googleaires are too interested in celebrating their 5-year anniversary as a public company. Check out the Google (GOOG, Fortune 500) homepage and you don’t see one of its usually witty cartoon renditions of the logo like you do on other “holidays.”
Nonetheless, it’s been an interesting five years for the search giant to say the least.
The company has used its strong stock price and mountain of cash reserves as currency to scoop up the likes of YouTube, DoubleClick and Postini to name a few.
Google has also remained relatively focused on its core search business, resisting the temptation to go overboard in the glitzy, but not all that profitable, social networking business. And that’s a good thing.
It’s common knowledge in the industry and online that Google Inc also bought my company, aptly named Google, back in 2004 before going public.
I often wonder what would have happened if I had not agreed to accept their out-of-court settlement and NDA back then. Mind you, I can take away many friendships made and the knowledge that we are one of the few elite companies in the world who actually know how it is done.
Maybe I will write a book one day – we only agreed to a three-year non-disclosure and to keep our mouths shut until everything was up and running.
Nothing has changed really; Google is fundamentally the same, and it works the same – they have just added and titivated a few things. How do we know this? Well, every one of our clients is still on page one, yes, even five years on.
It turns out that consumers rely on images in search engine result pages more than Google and Microsoft execs thought. Knowing this can help SEO professionals better optimize sites, according to executives from both companies who spoke at the Search Engine Strategies conference last week in San Jose, Calif.
It also turns out that image search is very important. Todd Schwartz, group product manager of the online services division for Bing at Microsoft, shed light on the types of queries that consumers search for on Bing. Following “Web search”, categorized as a “vertical”, images took the No. 2 spot.
Not surprisingly, R.J. Pitman, director of product management for global search properties at Google, believes that image search as a vertical sits at No. 1, rather than No. 2. The growth comes from the “more than 1 billion” camera phones being sold yearly, and the ability to share pictures. Google sees “hundreds of millions of searches daily across billions and billions of images,” he says. Images are no longer a “nice to have, but a must have” piece to promote businesses online.
Pitman says Google has begun to rank images based on the quality of the image. People need to stop thinking about the photos as images and look at them as digital bits of information, where pixels in the frame actually mean something. Google considers more than the sitemap feeds, title tags and attached metadata when ranking images. The search engine now looks at what’s in the image. It helps Google find and serve up similar images through object and facial recognition, according to Pitman, who says to consider these facts to better optimize “when building next-generation Web sites.”
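The shift Pitman describes – treating pixels as meaningful signal rather than ranking only on sitemaps, title tags and metadata – can be illustrated with a toy sketch (my own example, in no way Google’s actual algorithm). A simple average hash turns an image’s pixels into bits, so near-duplicate images can be spotted from content alone:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image's mean. Similar images yield similar bits."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Count differing bits: small distance means similar images."""
    return sum(x != y for x, y in zip(a, b))

# Two near-identical 4x4 grayscale images and one unrelated one
# (made-up pixel values for illustration).
img_a = [10, 12, 11, 13, 200, 210, 205, 198, 9, 11, 10, 12, 201, 199, 207, 203]
img_b = [11, 13, 10, 12, 198, 208, 206, 200, 10, 12, 11, 13, 199, 200, 205, 202]
img_c = [200, 10, 195, 12, 11, 205, 9, 210, 198, 13, 202, 10, 12, 207, 11, 199]

print(hamming(average_hash(img_a), average_hash(img_b)))  # small distance
print(hamming(average_hash(img_a), average_hash(img_c)))  # large distance
```

Production systems work on far richer features – object and facial recognition, as Pitman notes – but the principle is the same: the pixels themselves become an index key, not just the text attached to them.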