The myths surrounding SEO are many, ranging from what the algorithm contains to how to trick the engines into ranking you highly with no effort, and everything in between. Like all rumors they have a beginning, and it seems someone is trying to start a new one on the Webmaster World forums.
A site owner whose site ended up being ‘Penguinized’, as he put it, has become overly paranoid about any and all content on his site. He has basically decided that all user generated content is a potential red flag for spam, and as a result has removed or disabled all of it. And so we have the (potential) birth of the SEO myth that user generated content, comments, forums, or other ways to directly interact with your customers can lead to Google applying a penalty to your site.
To be clear: user generated content will not lead to any penalty being levied against your site.
It’s topics like this one, started on active forums and blogs, that lead to a great deal of confusion in the search world. It might seem like an innocuous discussion taking place in a proper forum by someone looking for information, but the way the discussion was handled has the potential to lead to long lasting repercussions. What often ends up happening is that someone new to the search world finds these posts and begins to believe them, and the myth continues, changes, and unfortunately grows.
And finally, after months of patience, site owners with a Google webmaster account have the final say over how links to their sites are treated. From the Google webmaster blog:
Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manual spam action based on “unnatural links” pointing to your site, this tool can help you address the issue.
This is going to be a great tool to add to your toolkit if you use Webmaster Tools directly, and if you don’t, you should check that your site manager is keen on what the tool can actually do for you. A quick rundown of how links to your site affect you: you create content, and if it’s unique content that is relevant to your niche, users will link to that page. Those links are used as a factor when determining relevance in the results pages for the terms you wish to rank for, and if you’ve created great content, the links will follow. The number of links is used as a measure of relevance, so the more the better. There is a downside to links, though, and it arises when you have too many ‘unnatural’ links pointing to your site, such as links from a plumbing site pointing back to your website on shoe sales, two topics with no relation to each other. Previously, your only recourse as a site owner was to contact the website that posted the link and ask to have it removed. It was then out of your hands and left for them to deal with, and until they did, you could be handed a stiff penalty from the search engines.
The problem with that scenario is that after you’ve asked the site owner to remove the link, you no longer have control over what happens next. With the addition of the disavow tool, you can now take matters into your own hands and manage the backlinks coming into your site. It’s a great step in cleaning up the web and improving the relevance of the search results overall. You can find out more about the disavow tool here.
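For those curious about what using the tool actually involves, it comes down to uploading a plain text file listing the links you want Google to ignore, one URL or domain per line, with # marking comment lines. A minimal sketch, with the domains purely illustrative:

    # Plumbing directory links we asked to have removed, with no response
    http://plumbing-links.example.com/shoes.html
    # Disavow every link coming from this domain
    domain:spammy-directory.example.net

Worth noting: Google has described the file as a strong suggestion rather than an instant switch, so cleaned up rankings won’t appear overnight.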
Google has always held the spotlight when it comes to search, ever since it revolutionized the way users access the web. It has grown to the point where, in the last year, the company consolidated all of its privacy clauses into one giant blanket policy covering all of its online properties.
An example of moving forward with search, one I’ve mentioned a handful of times on the blog, is the DuckDuckGo search engine. Recently the small search company produced a video in which they talked about how Google has each user caught in what they called a search bubble. They took more than 100 users, ensured they were not signed into their Google accounts, had them conduct searches on specific terms, and captured their results.
What they found was that even when the users were not signed into their accounts, and even within the same geographical area, they received differing results pages. It’s not a revelatory video, really, as Google isn’t the only company on the web that uses browser cookies to determine who a user is and what they may like. Not to discredit what DuckDuckGo is hinting at, but with such a small sample, and with users on their personal computers without clearing any session cookies, it’s no wonder the results differed for each user. A control group, perhaps 20 users or so on completely clean installs of a browser and OS, would help balance their results.
The numbers for the past month in search came out, and while Google remains on top with the majority share, what was somewhat disheartening was the continuing slide of Yahoo’s position.
There’s been no real shift in the overall numbers: Google is still sitting at just over two thirds of the search share, with Bing following at about 15%. Yahoo slipped even further than in previous months, to around 12%, giving the two partnered engines combined just shy of 30% of the market. Yahoo was one of the primary search engines and one of the first to roll out a paid search marketing platform, so it’s disappointing to see them slip further out of the limelight of relevancy. Change, however, is inevitable, and in the end tends to be a good thing for all parties involved.
The search share numbers aren’t terribly surprising in the grand scheme of things, and perhaps it was the additions of Panda and Penguin to the equation, but the number crunchers are at it again. On the webmaster forums there is a discussion going on about what the current algorithm may contain and how it might use analytics to help rank the sites in the index.
Some interesting theories are coming out of the discussion, mainly because no one outside of Google really knows the process for ranking the sites within the index. Google has said previously that it doesn’t use any search data from its Chrome browser, and the running theory so far is that the search giant is using click data from ISPs. In the end, only the team at Google really knows how the engine ranks its results.
There’s been a small surge of malware reports coming from searches on Bing and Google, which really isn’t news in and of itself, as malicious results are always buried somewhere in the listings. What is different is that on some terms, more than 90% of image results were found to be malware related.
The most targeted term this time around happens to be “Emma Watson”, whom McAfee has named its most dangerous celebrity search of 2012. Of the two engines, 30% of Google’s searches had malware warnings attached, and more than double that came out of Bing’s results. Malware takeovers happen in a couple of different ways: one of the most frequent is websites built with little to no security, and then there are throwaway websites and URLs used purely for spreading malware on the web.
Black hat SEOs typically go after the hottest search terms and poke around the web looking for websites with loopholes in their security. They actively work to hijack the website and its URL to lend false authority to whatever term they’re looking to spam. And because uneducated or hasty users tend to automatically trust the top results in the search engines, the spread of malware will continue.
Because of the recent discovery that image results are being slammed with malicious entries while the text results pages are being left behind, Bing has been unofficially dubbed (for now) the most poisonous search engine. Not to fret, however: Bing and Google will take steps to close the holes that have been opened in the image results, and in the meantime, just be a little more cautious before clicking that top image of your favorite star.
Google is getting into the credit business for the first time, with the launch on Monday of a programme in the UK to finance purchases of its online advertising by businesses.
The move marks the opening of a new front in the battle between the biggest internet companies, as they turn to their balance sheets as a source of competitive advantage. Amazon said last week that it had begun making loans to independent sellers that offer their products on its marketplace, marking the online retailer’s first move into financial services.
Google’s decision to issue its own credit card, which will also be made available in the US within weeks and other unspecified countries later, signals the company’s first attempt to use its huge cash reserves to support its core search advertising business by subsidising low-interest rate credit lines.
It said it would offer customers credit of between $200 and $100,000 a month to pay for their use of Adwords, which places messages next to the results in its search engine and made up the bulk of its $37bn in advertising revenues last year.
Read full story here
So Twitter and Google worked together until the 2011 fallout, back when Twitter supplied real time results to Google. Now Twitter is getting more into SEO. Who said it was dead?
After changing its robots.txt file some weeks back, Twitter has now let the search engines, Google, Bing and others, check out its user profile directory, basically a sitemap of all the users. This, of course, will help people find the accounts they’re looking for with the various search engines.
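The mechanics of a change like that are refreshingly simple. A robots.txt file just tells crawlers which paths they may and may not fetch, so opening a directory up is a matter of dropping or overriding the Disallow rule that covered it. A minimal sketch, with the path purely illustrative:

    # Before: the profile directory was off limits to every crawler
    User-agent: *
    Disallow: /i/directory/

    # After: the directory is explicitly opened up
    User-agent: *
    Allow: /i/directory/

The Allow directive isn’t part of the original robots.txt standard, but the major engines, Google and Bing included, all honor it.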
According to reports, Google has indexed 718,000 matching results; Bing, with its famously slower bot, has only the directory home page at present, but will surely get to the rest soon.
So SEO still lives on, well, in the eyes of social media sites anyway.
Came across this great infographic by Aaron Wall at SEOBook, and what I found most interesting was the deluded people mentioned; here at Fresh Traffic we have been coming across people like this since day one of the internet. The truth of the matter is that most people who say or mention this are on the list. Why? Simple: they cannot do it.
Click to enlarge (PDF version)
It’s an honor to welcome another Brit to Winnipeg. Sir Richard Branson arrived to champion a pilot program, run with RAY, to put cellular phones in the hands of homeless and at-risk youth. Branson unveiled “Phones for Change” Thursday through Virgin Mobile Canada and Virgin Unite, two subsidiaries of the Virgin Group, the international giant he founded.
When we talk about branding here at Fresh Traffic, here is a company that has done it better than most. Richard, born in Blackheath, London, started his first business venture, a magazine called Student, at the age of 16. In 1970 he set up an audio-record mail-order business, and in 1972 he opened a chain of record stores, Virgin Records, later known as Virgin Megastores. Branson’s Virgin brand grew rapidly during the 1980s as he set up Virgin Atlantic Airways and expanded the Virgin Records music label.
I have had the privilege of working for one of the Virgin brands online and have had a few nights out with Richard in the past; a true gentleman with no airs, and a straight talking, intelligent man.
So Rich, welcome to Winnipeg, have a great few days with us, and no doubt we will run into each other very soon.
Richard’s Winnipeg Chamber Video
There’s merit to the engines’ web search teams and the tidbits of news they put out every day. Sometimes they talk about changes to the algorithm, and about how to expect a shift in the search rankings or placements.
It’s not uncommon for sweeping changes to be made to the web that leave old code irrelevant; if every little bit were always left as part of the formula, the search algorithms wouldn’t be the few hundred factors they are now, they would be in the thousands. The search industry would be substantially slower moving, both in use and as a business. Building a website would be an absolute nightmare if you still had to worry about table construction for a half dozen different browsers, or whether the Flash menu you’ve built with fancy fly away modules would even start up in other ones. Thankfully, none of that is an issue, as old coding techniques get replaced with much more up to date methods.
One of the basics of web development that we’ve always made sure to mention to web developers and clients is to make use of the meta keywords and description tags. For a number of years, the search engine optimization industry has said that the tags no longer bear any relevance or importance, and that the engines themselves will more often than not choose what they like to display anyway; by that logic, even the few lines of code it takes to put the tags in are a few lines too many. Fair enough, everyone is entitled to their own opinion, but I do offer an example of what can happen when you let the engines decide what to publish as your website’s description in search.
    SEARCH. <div> <div > </div> <div> <div templateType=”C1″> <ul role=”listbox” class=””> <li role=”menuitem”> headline </li>
Just a couple of points to note: this is from a multi-million dollar company with strong rankings in search and in traditional media, and that is the description that was pulled for use on its results page. The mix up won’t hurt their positioning online or offline, but it proves that skipping the basics isn’t always a good idea. For a new site, skipping the description tag on the premise that it’s an outdated step, and having that type of description pulled instead, could be the end.
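For comparison, writing the tags yourself takes almost no effort. A minimal sketch of the two tags in a page head, with the wording obviously placeholder:

    <head>
      <title>Shoe Sales | Example Store</title>
      <!-- The snippet you would like the engines to show on the results page -->
      <meta name="description" content="Hand stitched leather shoes, shipped free across Canada.">
      <!-- Largely discounted for ranking these days, but harmless to include -->
      <meta name="keywords" content="shoes, leather shoes, winnipeg shoe store">
    </head>

The engines may still substitute their own snippet when they judge it a better match for the query, but a well written description gives them something sensible to fall back on.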