Tag Archives: google

Search Market Share Numbers & Transparency

With all of the updates Google has rolled out lately, Penguin, Panda, and who knows what else to come, it’s becoming fairly common to read the occasional article on how poorly Google is faring as a search company. The headlines are even beginning to creep into mainstream media more and more often, especially with Google+ pushing into Facebook territory.

But when you start to look at the numbers, year over year, nobody is really going anywhere. Where online search is concerned, just over two thirds of users choose Google as their search engine when looking for information online. The Bing/Yahoo machine (Bing provides all of the results for Yahoo) held near a 30% search share for the month of May, overall a slight loss of share for the duo. Bing itself remained constant from April and gained from a year ago, but since it fills the role of search engine for Yahoo, it’s only logical to lump the pair together. The remainder of the search market is taken up by everyone else: Ask, AOL, and all of the other smaller engines out there like DuckDuckGo. These numbers cover the desktop search market.

The mobile search market is much different from the desktop variant. While there may be a much more varied platform base in mobile, it is absolutely dominated by Google, which takes a monster 95% share of the US market. It seems that regardless of how loudly some SEOs decree the death of Google as a search engine, the general user disagrees. At this point in the life of the web, the original search engine is still the best search engine, going by the numbers. Your personal use and interpretation will vary somewhat from the general public’s.

Google, the Government, and You

Going over the search share numbers, it’s plain to see that Google is sitting on the largest share of the pie, by a very clear margin. Being a company of such a huge size, with such a massive market share, makes you an impressively large target to take aim at. A couple of years back, in order to make more information available for public view, Google launched a feature they dubbed the Transparency Report. The idea was to give the general public a sense of the types of removal requests the company faces on an ongoing basis. They’ve now released their fifth data set, which gives a fairly clear timeline of events and online postings, and in their blog post from yesterday, Google notes a disturbing trend.

“We noticed that government agencies from different countries would sometimes ask us to remove political content that our users had posted on our services. We hoped this was an aberration. But now we know it’s not.”

It should be no surprise that governments are keenly interested in online activity and online content; it was only a short time ago that portions of the internet went dark in opposition to the proposed SOPA bill. But even though governments have been asking for blog posts, videos, and sometimes entire websites to be removed from the index, in the end they are just that: requests. And requests, by their nature, can be denied, which is what Google has been doing with most of the requests they’ve received. You can delve deeper into the data in Google’s full report, and it’s safe to assume that other search engines often receive the same requests to remove content from their indexes as well.

Getting Re-Indexed and Search Dominance

We’ve been over the steps of what to do when you’ve been penalized and dropped from the index, but once you’ve followed all of those steps, you might be wondering what’s next. To recap quickly: first, go over the notification email (which you almost certainly have) and address their major points of issue. If it’s bad backlinks, do your best to have them removed. Spammy content? Get a handle on it and rewrite it. Found out your SEO is playing the black hat game of gaming the engines instead of working with them? It’s time to drop them and call the real experts in search. After all of those steps, you resubmit your site for inclusion.

But once you’ve done all of that, it’s in the hands of the search gods. This is where you need to sit on your hands and wait for them to decide whether you’ve done enough to be reindexed and included back in the search rankings. What some people don’t realize, though, is that the search engines don’t always fully clean your record; it may only be a partial pardon, incentive, really, to clean up the rest of your act. Just as search engine optimization isn’t a black and white industry, neither are the penalties handed down by Google or Bing.

So, just how relevant is too relevant? It’s a question being asked lately as, more and more often, the results page tends to be overtaken by the same website. Matt Cutts and the Google team put out a short video trying to describe just what’s going on.

The method of displaying these newer results, however, has been getting under users’ skin. How diverse do the search results really look, or seem, when the top three or four spots, and sometimes the entire page, are taken up by a single site? Relevance to the search query is obviously what drives Google and the other search engines to deliver their results, and the better refined they are, the better it is for the end user. Have you had any instances recently where the search results page was dominated by a single result?

The Search Algorithms and Your Website

The web is a huge place, full of anything you can think of at any given time, because chances are if you can think of it, someone has made a website or web page for it somewhere. It could be as common as people writing about the latest movie or song, or as low key as a new local band, but if you hit up a search engine you will almost always find at least a web page about it.

And with all of the billions and billions of web pages and websites out there, it creates a market, and with any market come the marketers. Search engine optimization, AdWords, white hat, black hat: when you start reading about the industry you’ll find yourself running into terms that become more and more unfamiliar as you go. It’s no wonder that when you start the conversation with a prospective, or sometimes even an existing client, the question comes up: “Do you know how Google/Bing/Yahoo works? Can you promise me number 1?” The polite, short answer to that question is “No”, and the long version is “No, we can’t promise number 1”. And then the inevitable happens; they utter the beginning of the worst phrase you can hear as an SEO: “But I read/heard/was told that…”

Here’s the short reason why we can’t guarantee you number 1 in search for your business: the web and the search algorithms are always changing. When Sergey and Larry initially created the Google algorithm to run around and start indexing the web, it wouldn’t be a surprise to hear they never imagined it would get so massive. It’s rumored that the algorithm running now weighs somewhere between 250 and 300 ranking factors as it parses your website. Some of the confusion for those outside the market comes when they read an article about how someone has cracked the algorithm to always rank at the top. I apologize for being so up front, but anyone who tells their clients that is a conman. At this stage of the search game, with how long the algorithms have been changing and adapting, I doubt there is any one person employed by Google or Bing who could sit down and tell you just how it all works, because at this point the algorithms are too big, too complex, and take into account so many different signals that it’s mind boggling.

So your best course of action is to adhere to the KISS principle: Keep It Simple, Stupid. Don’t get crazy with your site, don’t get too clever with your content, follow the best practice guidelines, and you’ll be okay.

Facebook’s Public Offering Coming

The biggest news on the web of late has got to be the flurry of activity surrounding Facebook. Just in case you’ve been living under a rock for the last while, Facebook’s IPO is about to break into the open.

In the largest tech IPO put forward in history, Facebook is about to offer itself up to the stock buying market at a valuation just north of $100b (yes, that’s a ‘b’). It’s a massive enough pool of cash that it wouldn’t be uncalled for if Zuckerberg hopped into a giant vault and swam around a bit a la Scrooge McDuck. The offering smashes the other tech giants in comparison, and obliterates Google’s offering of just under $3b back in 2004. A fair amount of hype has cropped up around the number, along with murmuring that Facebook might even take out Google with its incoming influx of cash.

There is, however, the other side of the equation: Google and Facebook aren’t really in competing markets. Facebook is the dominant social network online with nearly a billion accounts, and Google is the reigning king of search. Both players have dominated their respective markets, and both have carved their living out of paid advertising. And it’s the advertising angle where some marketers believe Facebook will be stealing money right out of Google’s coffers.

Recently at SMX London, Amit Singhal opened the talks with some rather interesting information about Google, and about how no one there knows how it all works. That’s a rather broad statement, and it needs some definition. During the question period, someone asked Amit how much money Google makes on algorithm changes. Contrary to what the tinfoil-hat-wearing crowd believes, Singhal was adamant: “no revenue measurement is included in our evaluation of a rankings change.”

That might seem rather preposterous when you look at how their revenue model works; after all, the search giant has made its seemingly limitless billions on search. Going further, Singhal opened up about the fact that no one person knows exactly how everything at Google works (all of unpaid search, AdWords, Android, etc.), though he himself has a pretty good idea of how all of unpaid search works. Just some interesting food for thought, as the conspiracy theorists out there seem to think Google tweaks the algorithm whenever it wants a cash injection.

SEO Terms and You

Since we covered the very basics of how web developers, designers, business owners and SEOs can work together a little better yesterday, let’s get into a tad more detail. Taking it a little slower, we’ll discuss a handful of the terms you’re going to run into when working with a search engine optimization firm.

Once we’ve had the chance to take a good hard look at your website, one of the first things you’ll find us talking about is conducting keyword research. All this means to you as a website owner is that we need to know what terms you’re interested in ranking for, and we’ll break down your content to see if those keywords exist in a workable combination. It’s also a step taken when we look up your current listings and break down how you stack up against your competitors. It’s a simple step, though one which unfortunately gets abused when some people decide that spamming their keyword as many times as possible is a good thing.

Also tying into your website and its current performance is PageRank. It’s not as huge a metric as it once was, but it’s a ranking system created by Google’s Larry Page which gives your site a score based on factors like the authority of incoming links and the quality of your content and website, and that rank is passed on throughout your site. It used to seem that the higher your PageRank, the higher you sat in the SERPs, but Google hasn’t been as diligent in keeping the system up to date, with Panda and Penguin being introduced in the last couple of years.
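Since PageRank comes up in nearly every SEO conversation, here’s a toy sketch of the idea behind it, not Google’s actual implementation, which is private and vastly more complex. The three-page link graph and the 0.85 damping factor are illustrative assumptions only.

```python
# Toy PageRank via power iteration. The link graph is invented
# purely for illustration.
damping = 0.85
links = {            # page -> pages it links out to
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with an even split

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for p in pages:
        # A page's score is fed by the scores of pages linking to it,
        # each divided by how many outbound links that page has.
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new_rank

for p, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{p}: {score:.3f}")
```

The takeaway from toy models like this is simply that a page’s score depends on the scores of the pages linking to it, which is why link authority keeps coming up in these posts.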

Once we’ve determined what you want to rank for, how you currently stack up in your niche market and where to focus our efforts, you’re going to start hearing terms like geo-targeting and click-through rate a whole lot. Geo-targeting is the process of constructing your website and its pages to be specifically relevant to certain areas. You can easily work city-level geo-targeting into your site with adjustments to content, and you can even drill down into neighborhoods once you begin using tools like AdWords. By targeting your website you ensure that you’re working to capture your target market and increasing your overall click-through rate. Click-through rate, loosely defined, is the percentage of searchers who click on your link after performing a search. It’s a great metric to keep track of, as it can fairly quickly show you whether a new campaign or advertising strategy has had a positive or negative effect on your brand and business; the arithmetic is sketched below.
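To make the metric concrete, here’s the click-through rate calculation in a few lines of Python; the click and impression counts are hypothetical numbers, purely for illustration.

```python
# Click-through rate, loosely: clicks divided by impressions.
def ctr(clicks: int, impressions: int) -> float:
    """Return click-through rate as a percentage."""
    return 100.0 * clicks / impressions

before = ctr(clicks=120, impressions=8_000)   # before the new campaign
after = ctr(clicks=210, impressions=8_500)    # after the new campaign

print(f"CTR before: {before:.2f}%")  # 1.50%
print(f"CTR after:  {after:.2f}%")   # 2.47%
```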

Bing Gaining on Google?

Currently Bing is going through a transformation of sorts: they’ve revamped their look and performance, changed up the way they do social, and tried to streamline everything overall. The current end result: in their own internal testing they’ve come out ahead of Google, a near 10% gain while Google lost 10% of their score during testing. So what’s Bing been up to?

Firstly, they’ve been working hard at incorporating more of the social web into your search results. Earlier in the year, Google introduced their version of this idea as Search, plus Your World, and was met with the ire of the masses. The claim was made that Google was favoring their own social network and shunning Facebook and Twitter, with Google counter-arguing that they couldn’t gather information from those sources. Bing currently manages to pull information for searches from all of these sources: Facebook, Twitter, as well as Google+. It may seem as though Google was just blowing hot air, but it needs to be mentioned that late last year Twitter did effectively block the search engine, and Facebook keeps a pretty tight handle on what gets out onto the web, even from open and social profiles. Microsoft’s Bing currently has deals worked out with both of these parties to index their information, and as for Google+ profiles, if they’re set to public then everything on the page is indexed like any other public website.

Bing used to have your social mixed in with your search results, but they decided to change that idea and went in a completely different direction. All of the social search results have been moved off to the right side of your screen, where your friends, family and colleagues are ranked by relevance to your search. Also included in those social results are people and items which may be relevant to your query. The reason for the change, according to Bing, is that mixing the social results in with the organic ones diluted the page too much and your searches suffered for it.

So where does that leave us? Bing is in the process of launching their completely revamped search and social service, and they’ve made big gains in the search world, based on their own internal testing. A blog post on that point makes it a little clearer:

We regularly test unbranded results, removing any trace of Google and Bing branding. When we did this study in January of last year 34% people preferred Bing, while 38% preferred Google. The same unbranded study now shows that Bing Search results now have a much wider lead over Google’s. When shown unbranded search results 43% prefer Bing results while only 28% prefer Google results.

Ever-Changing Web Technology

As with all things, the way we use the internet changes on pretty much a daily basis. From a single browser interface to now having half a dozen available depending on preference and platform, web tech has been changing and evolving almost as fast as the web itself.

Take browsers, for example. Just a few years ago, in 2008, the online world was dominated by Internet Explorer, followed by Firefox and just a sprinkle of the odd ones here and there. That was the year Google Chrome was introduced, and since then the top seeds have changed some. As of the start of 2012, the browser market is split fairly evenly between the top two, Firefox and Chrome, as the most widely used, with Internet Explorer coming in a distant third and the rest still just a smattering on the internet landscape. As of March 2012, Internet Explorer has dipped under 20% of the browser landscape; thankfully at least half of that share is on the more updated version 8 of the browser.

But browsers aren’t the only change we’ve had online in the last few years; social media has become a massive market on the web. The largest player in the space needs no introduction: Facebook crushes the social market, with around half a billion users logged in on an average day. The discouraging portion of that number, however, is that nearly half of the businesses out there don’t use social media marketing to their advantage. Only about 20% of businesses are even using Facebook to push their brand and market, with smaller business owners more readily embracing the technology. Knowing it’s an avenue that needs to be explored and actually taking the step to do so are two different things, and it seems that a lot of the time people try to make it complicated. The chief concern for any marketing is return on investment, and while organic search engine optimization is the best return in the business, its cost and time requirements make it difficult for those with very shallow pockets. Free advertising, though, like you can find on Facebook and Twitter, is easy to measure: broadcast your ad or tweet, and measure your traffic over the next couple of days. It’s not magic; it’s simple math when you keep it basic, as the sketch below shows.
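To put that “simple math” on the page, here’s a minimal sketch of a before/after traffic comparison, under one big assumption: the visit counts are invented placeholders, which in practice you’d pull from your analytics package.

```python
# Hypothetical before/after traffic comparison around a tweet or ad.
baseline_daily_visits = [410, 395, 402, 388, 420]   # days before the tweet
after_daily_visits = [455, 498, 471]                # days after the tweet

baseline = sum(baseline_daily_visits) / len(baseline_daily_visits)
after = sum(after_daily_visits) / len(after_daily_visits)

# Percentage lift in average daily visits.
lift = 100.0 * (after - baseline) / baseline
print(f"Average daily visits: {baseline:.0f} -> {after:.0f} ({lift:+.1f}%)")
```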

The Goliath Complex of SEO

The goals of SEO are relatively simple: to make your site rank as highly as possible within the search pages for your niche. Whether you build houses, write stories, or draw pictures, search engine optimization is applicable to any website online. What a lot of smaller business owners can also use SEO for is to knock the big players down a peg or two.

It’s important for all parties to consider SEO a great equalizer online; you do, however, have to remember to stay within the rules. There are billions of web pages online, and yet with that daunting number in mind it’s still a relatively simple process to stay within the sights of the search engines. All you really need to keep in mind are the basics; even just following the best practices guidelines gives your website a shot at being picked up and indexed. But you also need to remember that the internet isn’t exactly a friendly place yet; a great deal of the web is free and wild. As a small example, you can’t control which websites choose to link to you. This can be a difficult hurdle to overcome, as irrelevant or inappropriate backlinks leading to your website can seriously hamper any SEO efforts you have in place. And this is only a single element of what’s known as negative SEO.

The larger, more established and authoritative sites such as Amazon are somewhat safer in this regard, but no one is completely immune to negative SEO. Negative search engine optimization can be defined as spammy links, blatant keyword stuffing, duplicate content, or anything else the search engines don’t consider white hat SEO. Smaller, newer sites are unfortunately more susceptible to negative optimization problems. In the beginning of a site’s growth, it may not have much content or many links pointing to it, and if you’re not careful with how you craft your content or structure your links and navigation, you may even get dinged as having duplicate or irrelevant content in your niche. The number one point to keep at the forefront of your mind, though, is that even while the internet is still wildly untamed, the playing field is actually relatively plain and simple. Follow the rules, manage your website, and monitor your content to make sure it doesn’t get scraped and hasn’t been copied from another resource. Even the big hitters can be taken down online; no target is too big or too relevant on the web.

Google’s Penguin Attacks

So the large update that Google pushed out late last week, which now has a name you can curse, Penguin, has had its share of folks caught in the crossfire and downranked. In case you were wondering, the short version is that the update was targeted directly at reducing webspam and sites which use “aggressive spam” tactics.

As always, Matt Cutts came out on his white horse maintaining that so long as you create quality, original content and stick to the best practices, you should be alright with this new update. What was discovered over the weekend, however, and something site owners couldn’t entirely be prepared for, was the side effect of targeting spammy sites. While as a site owner and web admin you can control what content is contained within your site, you unfortunately have very little control over who, or what, links to your site.

Larger online brands have felt little change from the update so far, but that doesn’t help any of the smaller sites out on the web. While Google mentioned that only 3% of search results would be affected, it seems as the week gets underway that the number will be a tad higher. The notable sites cropping up in discussions tend to be smaller e-stores using shared or affiliate information. In an affiliate layout, already not one of the search engines’ favourites, if any one site in the chain adopts bad practices, then the downranking factors will eventually get to your site as well.

Amid all of the uproar over sites being downed in the rankings, or in some cases completely lost, there have been some valid suggestions. One of the most basic, and most helpful, would be that instead of Google hurting everyone for being linked in a bad chain, it simply remove any ranking or relevancy from the original, infringing domain. At least that way not every site down the line gets kicked, and site owners won’t immediately go into panic mode.

Duplicate Content in New Website Creation

When you’ve decided to build yourself a new site, whether due to needing an update or because you’re just looking for a new image, there’s a very important step to monitor. You need to ensure, before you get too far into the process, that you’re not making the rookie mistake of allowing the search engines to index both versions of your website. Doing so can cause you grief and could ultimately see both websites penalized for duplicate content.

When you’ve begun working on the newest version of your site, you need to ensure that it’s not being indexed by the search engines, so you can work all you like without worry. The simplest way is to use your htaccess file to block the bots, or alternatively, if you have the means, you can work on a local server where the site isn’t technically on the internet at all. Duplicate content can cause Google or Bing not to know which page it should list in response to a search: the engines suddenly have two versions of your website and content to consider, and need to determine which they feel is the more relevant of the two. Seeing as your old site originally had the content, you stand to injure your brand’s reputation and new URL simply by working on a new site or look. A quick way to sanity-check that your staging copy is hidden from crawlers is sketched below.
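As one way to double-check the block, here’s a minimal Python sketch using the standard library’s robots.txt parser to confirm major crawlers are disallowed on a staging host. The dev.example.com address is a hypothetical placeholder; if you’ve blocked bots with htaccess password protection instead, the equivalent check is simply that staging URLs return a 401/403 rather than a 200.

```python
# Sanity-check that a staging site disallows crawlers via robots.txt.
from urllib import robotparser

STAGING_ROBOTS = "https://dev.example.com/robots.txt"  # hypothetical staging host

rp = robotparser.RobotFileParser()
rp.set_url(STAGING_ROBOTS)
rp.read()  # fetch and parse the staging robots.txt

for bot in ("Googlebot", "bingbot"):
    allowed = rp.can_fetch(bot, "https://dev.example.com/")
    print(f"{bot} allowed to crawl staging: {allowed}")  # want: False
```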

Duplicate content isn’t just a concern when you’re working on your own website; it’s actually a point you should make a note to occasionally monitor. A bothersome trait, and a difficult problem to tackle, is when your own original content ends up scraped by a bot and winds up on an aggregator site. You can hunt for copies by searching for key phrases and terms you’ve used within the content and/or title, and hopefully the only sites which come up are your own or those you’ve given permission to reproduce it. Typically scraper sites don’t rank that highly in search anymore, but there are still occasions where they show up higher in the results than the original creator. When that happens, you can become trapped in the terrible cycle of trying to have the scraped copies of your own, hard-earned content removed from the index, and credit given where credit is due. A rough way to spot-check suspect pages follows.
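As a small helper for that monitoring, here’s a rough sketch that fetches a list of suspected scraper pages and looks for a distinctive phrase from your own writing. Both the phrase and the URLs are hypothetical placeholders.

```python
# Spot-check suspected scraper pages for a signature phrase.
from urllib.request import Request, urlopen

SIGNATURE = "a distinctive sentence copied from your own article"
SUSPECTS = [
    "https://aggregator-one.example.com/some-post",
    "https://aggregator-two.example.com/another-post",
]

for url in SUSPECTS:
    try:
        req = Request(url, headers={"User-Agent": "content-check/0.1"})
        page = urlopen(req, timeout=10).read().decode("utf-8", errors="replace")
    except OSError as err:  # covers URLError, timeouts, connection failures
        print(f"{url}: could not fetch ({err})")
        continue
    found = SIGNATURE.lower() in page.lower()
    print(f"{url}: {'MATCH - possible scrape' if found else 'no match'}")
```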