There are a lot of tips online about search engine optimization and the methods you can use to rank higher in the Google/Bing/Yahoo SERPs. You can find posts of that sort on our blog here as well: discussions of white hat and black hat techniques, the commonly known steps as well as some of the not so obvious ones.
What you don’t find very often, however, are posts about what not to do, or what to look out for when you’re considering contracting a company to perform SEO on your website. While the search engines are somewhat flexible in what you’re allowed to do, there are most definitely some tricks that can get you black marked, or even kicked out of the SERPs entirely.
So, when you’re looking for a company to perform optimization on your site, keep your ears open for any of the below terms. If there is mention of using any of these practices, it’s time to run for the hills.
Using Cloaked Content
This is one of the most common practices out there, and one of the most likely to get your company banned. For the most part, when you create content for your site, you’re telling the search engines what your site is about, and Google/Bing/Yahoo then list the website under the titles and keywords that are found in that content. Cloaking is when a company shows Google one set of content, but shows visitors something different, such as ads or links to malware-infected sites. It will get a site removed from Google in very short order.
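To make the trick concrete, here is a minimal sketch of how cloaking typically works server-side. The function name and markup are hypothetical; real cloakers also branch on known crawler IP ranges, but the core idea is just a user-agent check:

```python
def serve_page(user_agent: str) -> str:
    """Return different HTML depending on who is asking -- the essence of cloaking."""
    if "Googlebot" in user_agent or "bingbot" in user_agent:
        # Search engine crawlers see clean, keyword-rich content...
        return "<h1>Helpful Guide to Water Softeners</h1>"
    # ...while human visitors see ads or links to unrelated/malware sites.
    return "<h1>Totally Unrelated Ads! Click Here!</h1>"

crawler_view = serve_page("Googlebot/2.1 (+http://www.google.com/bot.html)")
visitor_view = serve_page("Mozilla/5.0 (Windows NT 6.1; rv:6.0)")
print(crawler_view == visitor_view)  # False: two audiences, two different pages
```

Search engines catch this by fetching the same URL with ordinary browser user-agents and comparing the two versions, which is exactly why cloaked sites get removed so quickly.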
A lot of blogs say the meta keywords and description tags are defunct, but search engines still look to them as indicators of what a site is about. A site about water softeners, for example, will naturally contain content relevant to that industry. Some companies, however, try to game their rankings through what is known as “keyword stuffing”. Mainly this involves hiding keywords in single-pixel-sized fonts, or camouflaging text by making it the same color as the background, to try and get listed more often, for more terms. It may seem to work in the short term, but it will get a site removed from the SERPs.
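As a rough illustration (these patterns are made up for the example, not any search engine’s actual rules), the hiding tricks described above can be flagged by scanning a page’s inline styles:

```python
import re

# Heuristic patterns for the tricks described above: near-invisible font sizes
# and white text, which is suspicious on a (here assumed white) background.
HIDDEN_TEXT_PATTERNS = [
    r"font-size:\s*[01]px",      # text too small to read
    r"color:\s*#fff(fff)?\b",    # white-on-white camouflage
]

def find_hidden_text(html: str) -> list:
    """Return the patterns that match, suggesting stuffed or camouflaged keywords."""
    return [p for p in HIDDEN_TEXT_PATTERNS if re.search(p, html, re.IGNORECASE)]

stuffed = '<div style="font-size:1px">water softener water softener</div>'
print(find_hidden_text(stuffed))  # the font-size pattern matches
print(find_hidden_text("<p>Normal content about water softeners.</p>"))  # []
```

Real crawlers go much further (computed styles, off-screen positioning, z-index layering), but even a crude check like this shows why the trick is so easy to catch.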
Duplicate Content Websites
Some novice SEOs and SEO companies try to increase rankings by putting the exact same content on different pages across multiple sites. Typically they also use a scraper tool to gather quality content from other websites for their own. Search engines have gotten adept at catching this, and will happily penalize a website that carries too much duplicate content.
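Duplicate detection at scale is commonly done with some form of text shingling. Here is a toy sketch (not any engine’s actual implementation) comparing two pages by word shingles and Jaccard similarity:

```python
def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word sequences ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of two pages' shingle sets: 1.0 means identical."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page = "water softeners remove minerals from hard water in your home"
copy = "water softeners remove minerals from hard water in your home"
print(similarity(page, copy))  # 1.0 -- an exact scrape is trivially detected
```

Because shingles overlap, even lightly edited copies keep a high score, which is why scraping and reposting content rarely slips past the engines for long.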
Auto Generating Content
Another poor technique is using a program to write content for your website. This is exactly what it sounds like: taking one article and having a program rewrite it over and over by swapping out a few sentences and keywords each time.
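A hypothetical sketch of what such a “spinner” does under the hood: a thesaurus swap that leaves most of the original word sequence intact, which is exactly why engines can still catch it. The synonym table is made up for illustration; commercial spinners simply use a bigger one.

```python
# Toy article spinner: swap a handful of words for "synonyms".
SYNONYMS = {"great": "fantastic", "cheap": "affordable", "buy": "purchase"}

def spin(article: str) -> str:
    """Replace each known word with its synonym, leaving everything else as-is."""
    return " ".join(SYNONYMS.get(word, word) for word in article.split())

original = "buy a great water softener at a cheap price"
print(spin(original))  # "purchase a fantastic water softener at a affordable price"
```

Note the giveaway “a affordable”: spinners routinely produce this kind of broken grammar, one more signal that both engines and human readers pick up on immediately.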
Those are only a few of the terms you need to be aware of when speaking with an SEO company. It bears stressing: if any of the above techniques are mentioned as tools they use, avoid them at all costs. There is no shortcut to success in online marketing. Real SEO takes time, and the more time and effort you can put into it, the bigger the return on investment you can expect.
The anti-trust hearings against Google and its supposed stranglehold on the web have been continuing in front of the Senate. There seem to be people on all sides of the argument: Google on the defensive, Microsoft and a few others decrying that they’ve been wronged by the search giant. And one of the most basic arguments Schmidt has used to rebut the claims of unfair business could very well win the day. Schmidt’s defence basically says:
“Google faces competition from numerous sources including other general search engines (such as Microsoft’s Bing, Yahoo!, and Blekko); specialized search sites, including travel sites (like Expedia and Travelocity), restaurant reviews (like Yelp), and shopping sites (like Amazon and eBay); social media sites (like Facebook); and mobile applications beyond count, just to name a few.”
Now on one hand, yes, Google can provide most of the services available on the web, but there are often simply better options. If you’re big into social networking, Facebook is still the king; if you travel a lot, you use Expedia to find tickets and deals. I’ve personally used Amazon, eBay and Kijiji to post and purchase items, and even the smaller search engines like Blekko have their place, with a few tricks that Google just can’t do.
So Schmidt’s argument, that there are options available online and users just need to navigate to them, is essentially true. Google doesn’t so much dominate the internet as it has a dominating presence in the search arena. And as many would point out, Bing, Yahoo and the little start-ups like Blekko that come along chip away at that armour little by little. Google’s search advantage isn’t going to disappear or diminish in any great capacity until a revolutionary game changer makes itself known, just as Larry and Sergey did with Google.
So don’t worry so much about Google’s “dominating web presence”; instead, use your keyboard and mouse and investigate the alternatives. Just because one site offers similar products doesn’t automatically mean you have to use it. After all, you wouldn’t call Coca-Cola to order some Pepsi.
Some would think it foolish to continually bail out a sinking ship as it springs more and more holes the longer it’s in the water. Yet strange as that metaphor may be, it’s exactly what Microsoft has been doing for the last 4+ years with its search engine, first as Live and now as Bing.
Realistically, you might think it can’t be that bad: search is a multi-billion dollar a year industry, and Bing, together with search powered by Bing, has about a third of the market, so how bad could it really be? Bing’s search share has been growing slowly, albeit steadily, since the rebranding, and that growth has been positive. So how bad is it really? Since 2009, Bing has lost Microsoft $5.5 billion.
Those numbers put into stark contrast the amount of money that’s been lost in the search game. But Microsoft isn’t without tricks or ideas, and in a recent interview they hinted at some upcoming surprises.
Microsoft President of Online Services Qi Lu gave an impassioned speech about how Bing would improve search by “reorganizing the Web.” To do that, Microsoft plans to leverage its network of products and partnerships to gain a better understanding of what the user is after when they enter a query into a Bing search box.
And as if to emphasize that Microsoft has recognized the necessity for radical change, Lu also said in the same interview:
Microsoft could not and would not try to “out-Google” Google. Instead, it must “change the game fundamentally.”
If Bing stays the course, even the money analysts are predicting positive cash flow in the next 3-4 years, provided of course that Bing continues to take search share from Google. Lu’s statement, however, sounds as if Bing is poised to be sent in a new direction.
comScore has put out the August search numbers, and while it shouldn’t be a surprise, Google’s search market share is nearly 65 percent while Yahoo and Bing are collectively in the low-30 percent range.
Google had 64.8 percent last month, down from 65.1 percent in July, a drop of three tenths of a percentage point. But as Google dipped ever so slightly, Yahoo and Bing picked up its losses: Yahoo’s share grew by 0.2 points to 16.3 percent from the month before, and Bing rose by 0.3 points to 14.7 percent.
While Google drops slightly, the other two are growing their user bases. Considering the Microsoft-Yahoo deal last year, whereby Microsoft handles search queries for Yahoo’s pages, the combined effort alone is beginning to leave its mark on the search statistics.
Americans conducted 19.5 billion total core search queries in August (up 1 percent). Google ranked first with 12.5 billion searches (up 1 percent), followed by Yahoo! with 3.6 billion (up 5 percent) and Microsoft with 2.6 billion (up 1 percent). But with Microsoft still behind Yahoo by nearly two percentage points, one has to wonder whether all the investment and deals around Bing are even worth it.
When Google launched its first salvo into the social war with Buzz, it made some really big mistakes. Allowing anyone on your contact list to browse your contacts was a pretty big breach of trust for any social network, and it nearly sank all of Google’s social aspirations in one swoop. But fast forward 18 months or so, and we’re over a month into their latest social offering, Google+.
They’ve made some serious improvements to their social understanding by watching the explosive growth of Facebook and their own flop with Buzz. Privacy controls are easy and intuitive to manipulate, friends are easy to arrange, and messaging controls are plain and straightforward. It’s easy to say that Google+ may be a contender in the social arena, having hit 25 million accounts in a fraction of the time Facebook took, but public understanding and acceptance should temper that growth. People are beginning to understand the nature of social websites, with Facebook having been the king for so long. Many, myself included, have entire friend feeds blocked because all those friends do is play Farmville or Cafe World. Facebook boasts high day-to-day activity and retention rates, but if the majority of those people are just there to play games, the quality of that use is definitely in question.
But just like Google’s AdSense and the paid advertising you see on results pages, those game players on Facebook are served ads. Social media marketing is a very real avenue to explore if you’re a small company on a tight budget. Google+ doesn’t have business options set up at present, but Google has made clear that yes, they are coming. So get your practice in with Facebook, Twitter and PPC/AdSense marketing, because even with a “paltry” 25 million users, Google+ will be a qualified market for advertising.
All of the taglines you generate with Twitter, Facebook and soon Google+ may have more strength than you might think. Nicholas Schiefer recently won at the Canada-Wide Science Fair and made interesting inroads in the realm of search. The 17-year-old is being compared to Mark Zuckerberg for the idea and implementation of his search algorithm, and those are no small shoes to fill.
The algorithm as written searches short documents: tweets, Facebook statuses and news headlines for starters. That 140-character string of gold is crunched and parsed by his infant algorithm to deliver results. It may not seem much different from what Google, Bing or Yahoo offer, but where it does differ is in applying context to the results. A semantic algorithm that could determine context in the results it retrieves would be a great improvement in the realm of social search. As an example, if you’ve been out for dinner and had a poor experience, you could use that kind of search engine to find out whether others have had the same experience. It’s possible to do so with the existing search engines, but it takes a bit of work to sort through the results for customer reviews if you don’t include that as part of your initial search. It’s an impressive start for a young man who may be part of changing the way the world searches. Time will tell how interested the world is in semantic, contextual searching, should Mr. Schiefer continue his project.
So in the world of search there’s a handful of true search engines, those little boxes into which you type your current question or conundrum before heading off into the wild internet. We have Bing, which holds somewhere around 27% of the search market; Google, which holds the lion’s share at just over 65%; and all those little crumbs at the bottom are search engines like Ask.com and the rest.
It’s not difficult to find press about how Bing is making massive inroads into Google’s share of search, or how Bing grew by over 90% last year... blah blah blah. When you boil the numbers all the way down, however, all you’re really left with is Google and Bing, and the only way Bing can make positive growth in search is to take it from Google. So headlines to the tune of “Bing overtaking Google” or “Bing Grows 90% over the year” are almost wholly misleading. Even with all of this “incredible growth”, and all of the add-ins and marketing strategies Microsoft throws at Bing, they’re left with a fairly large problem: despite owning more than 25% of the world’s search volume, Bing doesn’t make any money for Microsoft.
That may not seem to make any sense, but look at it from a different perspective: the advertising angle. The sole product sold by search engines is the advertising that appears on search pages, sold not for a set amount but based on how many times users click an ad tied to the search phrase that brought them to the page. Since Google has such a huge share of the search market, they’re rolling in cash right from the start thanks to the cost per click of their AdWords program. The single biggest reason Bing doesn’t make money isn’t just that they have a smaller search share than Google; as it turns out, the cost per click tied to their advertising model is as little as 1/5 of Google’s. As bad as that may sound as a revenue model, it actually gets a little worse for the Bing machine. A lower CPC looks great on the surface, but as an advertiser it raises the question of what is driving that low cost. Bing has less traffic than Google at the outset, the CPC to serve the same ad on Bing is cheaper than on Google, and in the end it translates into fewer ad impressions on the Microsoft search engine.
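To see how the volume and CPC gaps compound, here is some back-of-the-envelope arithmetic. All the figures below are made-up round numbers for illustration; only the rough 1/5 CPC ratio comes from the discussion above.

```python
def ad_revenue(queries: float, ads_per_page: float, ctr: float, cpc: float) -> float:
    """Revenue = queries x ads shown per page x click-through rate x cost per click."""
    return queries * ads_per_page * ctr * cpc

# Hypothetical monthly figures: Bing at roughly 1/5 of Google's query volume
# and roughly 1/5 of Google's cost per click.
google = ad_revenue(queries=10e9, ads_per_page=3, ctr=0.03, cpc=1.00)
bing = ad_revenue(queries=2e9, ads_per_page=3, ctr=0.03, cpc=0.20)

print(f"Google: ${google:,.0f}  Bing: ${bing:,.0f}")
print(f"Revenue ratio: {google / bing:.0f}x")  # 1/5 volume * 1/5 CPC -> ~25x gap
```

The point is that the two disadvantages multiply rather than add: a fifth of the traffic at a fifth of the price per click leaves roughly a twenty-fifth of the revenue.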
So the question in the end really is: is there ever going to be a solid competitor to the Google machine? If a multi-billion dollar a year company can’t even step into the same arena as the giant and succeed, who truly can? I say bring them all on; competition is what made the web what it is today, and more will only make it better.
So if you’ve been tracking your site’s progress on Google’s search results pages and you noticed some funny movement in the last week or so, you’re not imagining things. Google finally came out and admitted it: yes, they’ve had another regular update, but with Panda as part of the equation this time. Some have noticed their sites shifting a half dozen places or so, and some have noticed that for some of their optimized terms they’ve completely disappeared.
As shocking and disturbing as it may be to suddenly find you’re not in the results where you were in previous weeks, you may want to hold off on a complete site revamp to address your disappearance. To put it another way: Google took their search index, full of billions and billions of terms, tossed it in the air, and all of the websites are still coming down, being filtered into their most relevant terms under the current algorithm. It’s safe to wait just a few more days and see what happens through the weekend.
Google and +1
So search, it’s a funny game: moving, shifting, always changing. Facebook has its ‘Like’ button, to which Bing has added its own special metric and weight. And Google has its newer +1 button, about which they’ve basically said, ‘Yes, it’s good to have it on your site alongside the Like button’. The basic fact, though, is that the implementation of the +1 button was actually bogging sites down as of late, cutting performance almost in half in some extreme cases.
While the Facebook ‘Like’ button is a flat blue color, the +1 button is a script or two that glows and stands out from your web pages. That made it a definite hindrance for performance-conscious site owners, and it wasn’t long until another disturbing trend was noticed: visits to pages with the +1 button were slowly and steadily dropping. Almost on cue, Google has released a new version of the +1 button: faster, sleeker and much more in line with current web speed standards.
And just like the Facebook button, the +1 button has attracted those scandalous people who make a living selling their browser clicks. Because the +1 button can have a positive effect on your search ranking, some of the less scrupulous SEO companies out there are now selling +1 clicks. It’s not much of a stretch or a surprise really, as there are grey-hat SEOs all over the web selling all manner of SEO tricks: selling links, scraping and rewriting content for you, selling Facebook ‘Likes’ and now +1s. Cut the SEO juice from the button and its true use will emerge: promoting content because it’s genuinely good content.
Unless you’re a member of the tinfoil hat crowd, you’ve undoubtedly used the internet and a search engine at some point in the last few days. You may have used Bing, maybe Google, but you had that need for information. Regardless of which search engine you used, you made your choices based on what you learned. But if you’ve ever been curious and taken the time to compare, the results from Bing and Google can sometimes completely differ for the exact same search.
Effective searching is, strangely enough, a skill everyone online should have, yet few do. It’s sometimes difficult to explain to clients, both existing and prospective, that the more complicated you make search in your head, the more frustrating your SEO campaign will be. The first problem you need to overcome as a business and website owner is the idea that when people search for you online, they use the niche or specialized terms you use in your trade. This is where things begin to get overcomplicated. If you own a website and business that fixes vacuums, it’s in your best interest to optimize and build your site around that theme. The wrong approach would be to optimize your site around all of the different brands you deal with instead of using an all-encompassing term.
Different search engines display their results differently as well, and you’ll show up for different terms in each. Some of the factors that influence where, when and how you appear are your content and your URL structure; even a lack of content can influence your positioning somewhat. There’s no such thing as too much content, provided of course it’s relevant to your business and website. Keep it simple, don’t overthink it, and before you know it you’ll be showing up in the SERPs for all sorts of terms and phrases relevant to your business.
When you log on to your computer, fire up your browser and start your internet trek for knowledge, entertainment or whatever it is that has your mind occupied, are you going to be able to find your answer? It’s a question that has been gaining more and more traction in the last year or so, and DuckDuckGo, a new start-up search engine, has been shaking the search cage in an effort to forge its own path.
Recently they put up a page detailing how, when you perform a search on Google, Bing or Yahoo, you’re not getting a true results page. The screenshot of the search results clearly shows that different people receive different results for ‘Egypt’ as a search term. Even without reading the link text, it’s clear that the results pages are vastly different. Why they are different comes down to dozens, if not hundreds, of reasons. It can be as simple as your location in the country, the time of day, or the trend in the news lately. The short pictorial on the DuckDuckGo page details how search engines, Facebook, Twitter and the rest are all delivering pre-packaged results based on your web usage, and DuckDuckGo contends that this shouldn’t be happening.
DuckDuckGo is a search engine that doesn’t save your search history, doesn’t pass your search terms on to the websites it refers you to, and has a nifty red box they call zero-click info (handled by Wolfram Alpha) that appears on some searches; with all that, it’s throwing its hat into the search engine ring. Being a new player at an old game is a tough market to break into, and DuckDuckGo is attempting to deliver a filtered *and* unfiltered internet. It’s a noble idea with some merit: if you’d like to perform somewhat private searches on sensitive matters, it may be an alternative for you. Google Chrome and Internet Explorer, however, both offer a cookieless browsing mode that accomplishes much the same result, so you don’t really have to give up the engine you know and are familiar with.
The only real way to test whether you genuinely live in a “search bubble” is to perform the same search, with zero clicks, on multiple computers. If your results are significantly different from other people’s, then perhaps you have a case. Personally, after viewing the screenshots and looking closely at how many pages were fetched for each search term, there are tens of millions of pages of difference, so of course the results are going to differ. Part of Google, Bing and Yahoo’s success comes from the fact that they pass some search data to the referred website in the form of the search term; it’s what enabled the search engines to build their ad programs for web users. There are dozens of variables behind the results you receive when you click that search button; even something as simple as which data center serves your results influences your page. If that data center happens to be running an index a few hours older than the others, you can easily get different results when performing the same search multiple times.
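If you do run that experiment, one hypothetical way to put a number on how different two result pages really are is to measure the overlap of the URLs they return (the URLs below are invented for the example):

```python
def result_overlap(results_a: list, results_b: list) -> float:
    """Fraction of URLs shared between two result lists (ranking ignored)."""
    sa, sb = set(results_a), set(results_b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Made-up top results for the same query from two different machines:
machine_a = ["wikipedia.org/egypt", "news.example/egypt", "travel.example/egypt"]
machine_b = ["wikipedia.org/egypt", "blog.example/egypt", "travel.example/egypt"]

print(result_overlap(machine_a, machine_b))  # 0.5 -- half the combined URLs are shared
```

Identical result pages score 1.0; only a score that stays well below that across many machines and sessions would actually support the filter-bubble claim.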
The founder of the newer engine DuckDuckGo recently discovered that he’s being hit with tons of spam queries for all sorts of seemingly random searches. He’s noted that while he can block these botnets from spamming his servers with the same query over and over again, the traffic has raised a question for him.
In his own words from his blog:
“if other search engines include this errant traffic in their query counts. We work hard to keep them completely out because they would overwhelm our real direct queries #s and therefore distort our perception of progress. “
And while Gabriel makes a solid point and raises a great question about the quality of the search and query numbers being generated, I think he’s missing the simplest answer. He has managed to block the botnets at the firewall level to prevent them from skewing his query numbers. And given that the other search engines, Google, Bing and Yahoo, have been around far longer, it’s a fair assumption that they’ve already dealt with false searches of this kind. As far as SEO is concerned, this kind of activity is a form of query spam: bots firing off hundreds of thousands of queries so that the websites in question rack up search impressions. It’s a game that was dealt with years ago by both Google and Bing, so it’s almost completely a non-issue.
I think the more realistic reason behind the botnet traffic on the new search engine is a very simple problem that anyone with a website that has an input box and no validation can relate to: it’s just spam, looking for an exploit or a kink in the code of the website software that’s been picked up. It’s argued that the small search engines like Blekko and DuckDuckGo offer a better quality of search because they are smaller and less bloated than their big brothers. In time, however, I can see it being realized that the larger these small engines become, the more difficult it will be to deliver incredibly fast results (less than half a second) while maintaining a complex index of hundreds of billions of pages. Google just last year reached the 3 trillion pages indexed mark, a number that would cripple most data centers in existence.