Republican Rick Santorum has had a few issues over the course of his run for the US presidency. All politics and personal opinions aside, Santorum has run into the most recent iteration of what used to be known as Google bombing.
A few years back, if you visited Google and searched for the term ‘miserable failure’, the first result was the then president of the US, George Bush. Santorum unfortunately hasn’t been so lucky with what he’s become associated with, and the reason he’s momentarily stuck there is due, in part, to how SEO works.
The bare basic principles of SEO: you create content, you contact people to link to it, and you repeat over and over again until you’re the king of the mountain. Backlinks, the fuel of SEO, are what keep you flying high in the rankings. An extremely popular social community, Reddit, has decided that the politician will remain associated with this alternate meaning for the near future at the very least.
You can think what you will of the woes he is experiencing; on a technical note, however, it needs examination and repeating. Until Google, Bing and Yahoo outright remove a website from their indexes, it can and will continue to rank for the term(s) it’s been written to be found for. Completely removing a site from the index is an extreme measure that the search engines rarely wield, let alone with permanency. It needs to be noted that regardless of the poor quality of the links, the relevance and sheer number of them pointing to the offending website are what is keeping the politician tied to his less than stellar ‘definition’. To quickly deal with the issue, the search engines could enforce a nofollow on the Reddit site, but Google in particular deems itself a color-blind indexing machine. Mr. Santorum would do well to contract the skills of a search engine optimization expert to assist with his character control.
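For context on what a nofollow would actually change: a link carrying `rel="nofollow"` tells crawlers not to pass ranking credit through it, which is exactly the signal at play here. A minimal sketch of that distinction, using Python’s standard-library parser to split a page’s links into followed and nofollowed (the sample HTML and URLs are invented for illustration):

```python
from html.parser import HTMLParser

class BacklinkAudit(HTMLParser):
    """Splits a page's anchor links into followed and nofollowed."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. "nofollow noopener"
        rels = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rels else self.followed).append(href)

# Hypothetical page: one link that passes ranking credit, one that does not.
page = """
<a href="http://example.com/a">plain link</a>
<a href="http://example.com/b" rel="nofollow">discounted link</a>
"""
audit = BacklinkAudit()
audit.feed(page)
print(audit.followed)
print(audit.nofollowed)
```

Only links in the first list count toward the kind of link metric that keeps a page ranking; flipping Reddit’s outbound links into the second list is the quick fix the search engines have so far declined to make.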
A question often asked in the course of a discussion with a client, whether new or existing, is how we determine our costs as a search engine optimization company. There’s no single, simple answer, as each client’s needs are unique, as anyone fully involved in the industry will tell you.
One of the factors that determines the cost of your SEO is what types of key terms you want your website optimized for. If you own a company which manufactures toys and you’re the new kid on the block, driving your website up in the SERPs is going to be difficult. A generic search for the term ‘toys’ returns 1.9b (yes, billion) pages. Granted, that’s a rather generic term to try and optimize for, but every term comes with its own challenges where keywords are concerned. The number of terms you wish to be optimized for also adds to the maintenance cost of your website.
Another determining factor that affects your overall cost is the overall quality of your website and its content. If we need to sit down with you and help you rewrite each and every page due to a lack of quality content, that’s a necessary step which needs to be in place before we even begin to think about scouring the web for backlinks to your site. Likewise, if your website is full of choppy code which needs to be addressed, or is so woefully out of date that a complete rebuild is in order, these also contribute to the overall costs.
These are only a couple of the factors which contribute to your company’s online advertising budget; thankfully, if you need to rebuild your website or rewrite your content, those are often one-time costs. The consistent maintenance required to continue ranking well in the SERPs, however, is where the majority of your budget needs to be directed. And as the web becomes more and more competitive, that budget will need to be adjusted every 12 months or so.
Did Bing play dirty over the shopping holidays? If you used the Bing search engine at all this most recent Cyber Monday, the signs point to yes, they did play dirty with their results.
The creators of the idea of Cyber Monday found themselves lost in Bing’s search listings because, according to Bing, their content was too “thin”. If the term is familiar, it’s because it sounds a lot like Google-speak from when they started rolling out the infamous Panda updates and culling “thin content” websites from their index. A difference to note, however: Panda didn’t actually remove the offenders from the index, it just meant the odds of those sites ranking well plummeted.
Back now to Bing’s version of taking care of thin content: removing websites which fall into this category. Cyber Monday is now a billion-dollar online shopping event, where website owners have the opportunity to make some good money heading into the holiday shopping weeks. A site which could promise and deliver strong referrals, and rank well, would stand to make a fair bit of change. Shop.org came up with the term Cyber Monday in ’05 and a year later created the corresponding website, cybermonday.com. This past Cyber Monday, Google had the website in their SERPs, while Bing did not. Bing did, however, have their shopping channel listed at the top of their results for the search ‘cyber monday’.
Bing has stated previously that they will dispense internet justice on sites deemed unworthy to be listed as part of their SERPs, but completely removing any and all traces of a site? Bing defines spam as:
Some pages captured in our index turn out to be pages of little or no value to users and may also have characteristics that artificially manipulate the way search and advertising systems work in order to distort their relevance relative to pages that offer more relevant information. Some of these pages include only advertisements and/or links to other websites that contain mostly ads, and no or only superficial content relevant to the subject of the search. To improve the search experience for consumers and deliver more relevant content, we might remove such pages from the index altogether, or adjust our algorithms to prioritize more useful and relevant pages in result sets.
So if Bing were to stick to their guidelines in removing the cybermonday.com website, they should remove all “thin” websites which fall under the same blanket. Yet they did not, and websites featuring almost identical content to the cybermonday.com website still appeared in their results. To further muddy the waters, the Bing-powered search results served up in Yahoo would turn up Black Friday “websites” which would be deemed even thinner than the Cyber Monday website. With all the fuss Bing put up about Google favoring its own results over all others, this sure doesn’t look good on the Bing radar. The Panda updates may drop a website’s rank if it’s found to be too thin, but at least they’re not completely removing sites from the index à la Bing.
I had an odd occurrence recently in terms of how search is evolving, and it involved a rogue browser extension. It’s mildly annoying when a toolbar gets installed in your browser of choice, but it’s frustrating when it’s installed without your express knowledge, say by having the install clause buried in a EULA for another program.
The rogue extension in question was Surf Canyon; a real-time search reorganizer would be the short description. With the internet comprising literally trillions of web pages, search engines like Google, Bing and Yahoo are the big hitters in locating what you need online. They all offer their own pros and cons, and Google has been the weapon of choice for the majority of searchers for the past 10+ years.
Real-time search results have become a challenge for all of the search players, with everyone working on a solid solution for serving up relevant results which complement the current organic offerings. The idea behind the Surf Canyon extension is that it personalizes the web for you as you search. A fine enough idea; what was actually noticed, however, was that the extension has something of a mind of its own.
Toolbars are a nuisance in a browser, and fake links on webpages are a pain, as you don’t really know what’s real and what isn’t without clicking. But a browser extension which plants false links into webpages you know have no outgoing links? That’s poor business practice and sketchy access to a computer and its browsing history.
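One way a cautious user could confirm this kind of injection is to compare the links in the raw page the server actually sent against the links in the page the browser ended up rendering; anything in the second set but not the first was added client-side. A rough sketch of that comparison, using only Python’s standard library (the sample pages and URLs here are hypothetical):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href value from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def injected_links(server_html, rendered_html):
    """Return links present in the rendered page but absent from the server response."""
    server, rendered = LinkCollector(), LinkCollector()
    server.feed(server_html)
    rendered.feed(rendered_html)
    return rendered.links - server.links

# Hypothetical example: the rendered copy carries a link the server never sent.
original = '<p>Plain text with <a href="/about">one link</a>.</p>'
modified = ('<p>Plain text with <a href="/about">one link</a> '
            'and <a href="http://ads.example/x">another</a>.</p>')
print(injected_links(original, modified))
```

In practice you would save the page source from a clean browser profile and diff it against the DOM the suspect extension produces; a non-empty result is exactly the supplanted-link behavior described above.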
Is Bing more biased than Google when it comes to the results pages? In research that has been gaining traction as of late, the answer seems to be yes. It wasn’t a directed study on a few select terms either, it was a large random sampling of the SERPs conducted by a professor at George Mason University.
What he found in the tests he conducted was that, for the most part, Bing favors Microsoft content more often and more prominently than Google favors its own content. According to the findings, Google references its own content in its first results position in just 6.7% of queries, while Bing provides search result links to Microsoft content more than twice as often (14.3%). The percentages may seem small, but when you consider there are billions of searches performed daily, suddenly 14% isn’t such a small number.
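To put those percentages in perspective, here’s a back-of-the-envelope calculation. The daily search volume used below is an illustrative assumption, not a figure from the study:

```python
# Illustrative assumption: roughly 3 billion searches per day overall.
daily_searches = 3_000_000_000

google_self_rate = 0.067  # Google links its own content first in 6.7% of queries
bing_self_rate = 0.143    # Bing links Microsoft content first in 14.3% of queries

print(f"Google: {daily_searches * google_self_rate:,.0f} self-referring results/day")
print(f"Bing:   {daily_searches * bing_self_rate:,.0f} self-referring results/day")
```

Under that assumption, even single-digit percentages translate into hundreds of millions of self-referring results every day, which is why the gap between 6.7% and 14.3% matters.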
The findings also cast a different light on the recent FTC antitrust complaints Google has been handling surrounding anti-competitive behaviour. It’s also a stark contrast to a similar study done earlier in the year, which concluded that “Google intentionally places its results first.” So now, as a user with two completely different data points, which set is to be believed?
Well, the second study had two goals in mind: to replicate the findings of the first study, and to expand on the methods used to determine if it was perhaps an issue in how the results came about. From the very beginning, it was found that while Google does favor its own content at some points, the selection of terms is exceedingly small. What was also learned, and wasn’t mentioned in the first study, is that Bing does precisely the same in preferring Microsoft results, but for a much wider range of terms than Google, and is much more likely to do so. “For example, in our replication of Edelman & Lockwood, Google refers to its own content in its first page of results when its rivals do not for only 7.9% of the queries, whereas Bing does so nearly twice as often (13.2%).”
As for the second part, the study used a much larger, more random sampling of search queries, as opposed to the mere 32 samples the first study used to portray Google as the big bad guy of search. And the findings of the second study were related at the beginning of this post: Google references its own content in its first results position when no other engine does in just 6.7% of queries, while Bing does so over twice as often (14.3%).
So what does this mean for the end user? Google (and Bing, though less so) really are trying to deliver the best results possible, regardless of whether they come from their own services (local search, product search, etc.) or not. It all comes down to preference.
There are many steps which are part of a successful organic SEO campaign. There are all the little steps, like writing good content, making sure you have the titles and meta tags in place, and having a menu which is comprehensive. When you’re finished with the good-practices pages, you begin to read about one of the most time-intensive steps of the campaign: link building.
Since Panda reared its head over the last year or so, there’s been chatter about how the SEO game has fundamentally changed. That scrapers and content aggregators, the black hatters and the link buyers would just disappear, and we’d have pristine, precise results. Time has started to play its part, and while the scrapers, aggregators, black hatters and link buyers have mostly been swept away, there has recently been a new call to revamp the way the system works. The desire to change the link building metric portion of the search game sometimes comes up in discussion as the points for and against the practice are argued. When you break it all down to the basic points, every search engine will tell you the same thing: content is king. If you produce quality, relevant content, you will rank in the SERPs.
The kicker about producing this kind of content is that you will naturally receive backlinks to your site and its pages. When you’re a new site and you need to visit and email possible customers and possible partners in the same niches, building those backlinks takes time. But they will be built, they will be taken as a metric by the search engines, and until an algorithm comes along which can read and evaluate content as a user would, link building will be relevant. It will be an important portion of any and every organic SEO campaign, no matter how big or how small. The success of your link building campaign can be directly tied to how much work you’re willing to put into contacting those in an industry which complements your own.
It’s reassuring that even though some businesses out there are slow to improve their websites or their online marketing toolset, the trend is slowly but surely shifting. While still only a fraction of the marketing dollars spent out there, the numbers show that around 17% of most businesses’ marketing budgets is being spent on online marketing. Any positive growth is good for everyone involved.
A great graphic has been put together depicting some of the changes coming about in the marketing world. In the US, 70% of businesses have indicated that they will be increasing spending on social media advertising (Facebook, Twitter, Google+), and 64% also chimed in to add that their budget is increasing for SEO as well. With consumers spending more and more time searching online for their next purchase, it’s much more advantageous to get into the game now as opposed to later. The longer you wait, the greater your costs are going to be. Surprisingly, however, 17% of businesses planned on increasing their marketing budgets for print media, which is much like buying stock in Yahoo these days. I kid, I kid; all jokes aside, almost anyone out there who has a job has access to the internet. It should be no surprise that on average people spend 3+ hours browsing the internet, and that 84% of internet users spend their time searching for information on whatever has caught their interest, across billions of searches per day.
There’s a great deal more information which can be gleaned from the stats; have a look and take a moment to consider your marketing plans. Are you on the side of innovation and forward thinking, or trying to cling to an outdated, unmeasurable standby? Just remember that the longer you wait, the more difficult the game becomes.
So remember a little while back when Google decided to try the whole social thing and launched Buzz? And it cost them a few million because they “oops forgot privacy”? Well, the FTC has finally decided how to handle the giant, and it will throw a bone to the privacy-concerned members of the public to boot.
For the next 20 years, Google will be subject to privacy monitoring from the U.S. Federal Trade Commission. Thanks to what are being called “deceptive practices” in its now-dead Google Buzz social networking service, the FTC will babysit the search giant. The investigation into Google’s privacy practices began after users complained that their Gmail contacts were made public, and that the steps to protect their privacy weren’t clear or effective.
When the service launched, Gmail users were given the option to participate in Buzz, but what Google failed to mention was that the people they email most often would be listed publicly. Those users that declined to participate were also automatically enrolled into at least some of the Buzz features without their consent, according to the FTC investigation.
“In response to the Buzz launch, Google received thousands of complaints from consumers who were concerned about public disclosure of their email contacts which included, in some cases, ex-spouses, patients, students, employers, or competitors,” the agency said. “The FTC charged that Google failed to disclose adequately that consumers’ frequent email contacts would become public by default.”
The FTC also added that Google had misled the public in regards to its privacy policies, and misrepresented its compliance with U.S. and E.U. Safe Harbor “or other privacy, security, or compliance programs.” So for the next 20 years Google is going to have a monkey on its back, with the FTC able to watch their every social change; if the search giant isn’t careful, it may find itself back in hot water.
It’s been just around a month now that Google+ became open for business, and Google remains undaunted in its effort to go toe-to-toe with Facebook.
Vic Gundotra, vice president in charge of Google+ said, “We are in an enviable position that we have people who come to Google, we are in this for the long haul… By Christmas you will see Google+ strategy coming together.”
Google+ has attracted more than 40 million users since it opened to the public, but has a long way to go to catch up with Facebook’s membership of approximately 800 million.
Google is looking at tying all of their current apps and extensions into Google+ accounts, the goal being to sync the whole mess together with Docs, YouTube, etc. Eventually, Google aims to open the platform to outside developers to make games and other kinds of installable “apps” that have been part of Facebook’s success.
Google is moving slowly and cautiously to make sure its social network is a safe, stable haven for families, friends, and other associates who connect with one another in “circles” created at the service.
Gundotra acknowledged that Facebook has the advantage of a “network effect,” in that complex webs of friends are established there and people might find it daunting to up and relocate to Google+.
“The incumbent (Facebook) has a huge advantage, if you play the same game, you are not going to win… So we are going to do it differently.”
One of the larger contrasts between the two networks: Google+ offers much more discretion over what you share, and with whom.
“We do not believe in over-sharing,” Gundotra said. “There is a reason why every thought in your head does not come out your mouth… We think a core attribute to being human is to curate.”
Google+ launched with a requirement that people use their real names online in order to let others find them more easily, but the aim is to eventually allow people to use pseudonyms on their accounts instead of their real names. The requirement has been a thorn in the fledgling social network’s side since early in its beta incarnation.
“We wanted this to be a product where you can discover people you know,” Gundotra said. “You don’t know ‘Captain Crunch’ or ‘Dog Fart’.”
Based on the rest of the discussion from the conference, it’s looking like Google can’t wait for Christmas to get here.
An issue that some small business owners who try to handle all aspects of their marketing campaigns encounter is time. Especially when it comes to the online marketing side of the business, being able to consistently put time and effort into your website is a crucial element.
There are a number of small business owners out there who are self-made, done-it-all-on-their-own types, so it’s only logical (in their minds) that they can handle all of their site maintenance as well. What’s quickly and unfortunately learned is that slapping on quick fixes and changes you may have read about in a forum or blog can be a hindrance to your online marketing efforts.
Another common misconception surrounding the online advertising industry is that it’s a one-shot, done deal. The truth of the online world is completely different, however, as it never sleeps, stops or even slows down. When you’ve broken down your site and completed all of the changes and updates to your content to help your positioning, no sooner have you finished than you need to start all over again to keep up with your competitors.
So when you realize, as a business owner, that you don’t have the time, the knowledge or the expertise to help your business rank at the top of the SERPs, it’s time to call in the experts. The most dangerous thing you can do to your website is try to interpret search signals and make changes based on speculation in a forum or a blog. Information taken out of context can bury your site if you’re not careful.