Google has held the spotlight in search ever since it revolutionized the way users access the web. It has grown to the point where, in the last year, it consolidated all of its privacy clauses into one giant blanket policy that covers all of its online properties.
One example of moving search forward, which I’ve mentioned a handful of times on this blog, is the DuckDuckGo search engine. Recently the small search company produced a video arguing that Google has each user caught in what it calls a search bubble. DuckDuckGo took more than 100 users, ensured they were not signed into their Google accounts, had them search for specific terms, and captured their results.
What they found was that even when the users were not signed into their accounts, and even within the same geographical area, they received differing results pages. It’s not a revelatory video, really, as Google is hardly the only company on the web that uses browser cookies to determine who a user is and what they may like. Not to discredit what DuckDuckGo is hinting at, but with such a small sample, and with users on their personal computers who never cleared their session cookies, it’s no wonder the results differed. A control group of 20 or so users working from completely clean installs of a browser and OS would have helped balance the results.
The search numbers for the past month came out, and while Google remains on top with the majority share, what was somewhat disheartening was the continuing slide of Yahoo’s position.
There’s been no real shift in the overall numbers: Google is still sitting at just over two thirds of the search share, and Bing follows with about 15%. Yahoo slipped even further than in previous months, to around 12%, leaving the combined Bing/Yahoo engine just shy of 30% of the market. Yahoo was one of the primary search engines and one of the first to roll out a paid search marketing platform, so it’s sad to see it slip further out of the limelight of relevancy. Change, however, is inevitable, and is always a good thing for all parties involved.
The search share numbers aren’t terribly surprising in the grand scheme of things, and perhaps it was the additions of Panda and Penguin to the equation, but the number crunchers are at it again. On the webmaster forums there is discussion about what the current algorithm may contain and how it might use analytics to help rank the sites in the index.
Some interesting theories are coming out of the discussion, mainly because no one outside of Google really knows the process for ranking the sites within the index. Google has said previously that it doesn’t use any search data from its Chrome browser, and the running theory so far is that the search giant is using click data from ISPs. In the end, only the team at Google really knows how the engine ranks its results.
There’s been a small surge of malware reports coming from searches on Bing and Google, which isn’t news in and of itself, as malicious results are always buried somewhere in the rankings. What is different is that on some terms, more than 90% of image results were found to be malware related.
The most targeted term this time around happens to be “Emma Watson”, whom McAfee has named its most dangerous celebrity search of 2012. Of the two engines, 30% of Google’s searches had malware warnings attached, and more than double that came out of Bing’s results. Malware takeovers happen in a couple of different ways: one of the most frequent involves websites built with little to no security, and then there are throwaway websites and URLs used purely for spreading malware across the web.
Black hat SEOs typically go after the hottest search terms and poke around the web looking for websites with loopholes in their security. They actively work to hijack the website and its URL to lend false authority to whatever term they want to spam. And because uneducated or hasty users tend to automatically trust the top results in the search engines, the spread of malware will continue.
Because of the recent discovery that image results are getting slammed with malicious links while the text results pages are beginning to be left behind, Bing has been unofficially dubbed (for now) the most poisonous search engine. Not to fret, however, as Bing and Google will take steps to close the holes that have been opened in the image results; in the meantime, just be a little more cautious before clicking that top image of your favorite star.
Google is getting into the credit business for the first time, with the launch on Monday of a programme in the UK to finance purchases of its online advertising by businesses.
The move marks the opening of a new front in the battle between the biggest internet companies, as they turn to their balance sheets as a source of competitive advantage. Amazon said last week that it had begun making loans to independent sellers that offer their products on its marketplace, marking the online retailer’s first move into financial services.
Google’s decision to issue its own credit card, which will also be made available in the US within weeks and other unspecified countries later, signals the company’s first attempt to use its huge cash reserves to support its core search advertising business by subsidising low-interest rate credit lines.
It said it would offer customers credit of between $200 and $100,000 a month to pay for their use of Adwords, which places messages next to the results in its search engine and made up the bulk of its $37bn in advertising revenues last year.
Read full story here
So Google and Twitter worked together until the 2011 fallout, back when Twitter supplied real-time results to Google. Now Twitter is getting more into SEO; who said it was dead?
After changing its robots.txt file some weeks back, Twitter has now let the search engines, Google, Bing and others, check out its user profile directory, basically a sitemap of all its users. This, of course, will help people find the accounts they’re looking for through the various search engines.
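For readers unfamiliar with how that works, a robots.txt file is just a short set of crawler rules at a site’s root. As an illustration only, not Twitter’s actual file, the paths here are made up, opening a directory to crawlers looks something like this:

```
# Illustrative robots.txt; paths are hypothetical examples
User-agent: *
Disallow: /search        # keep internal search result pages out of the index
Allow: /directory/       # open the (hypothetical) user profile directory to crawlers
Sitemap: https://example.com/sitemap.xml
```

A rule change like flipping a Disallow to an Allow is all it takes for Google and Bing to begin picking up a whole new section of a site.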
According to reports, Google has already indexed 718,000 matching results; Bing, with its renowned slower bot, has only the directory home page at present, but will surely get to the rest soon.
So SEO still lives on, well in the eyes of social media sites anyway.
I came across this great infographic by Aaron Wall at SEOBook. What I found most interesting were the deluded people mentioned; here at Fresh Traffic we have been coming across people like this since day one of the internet. The truth of the matter is that most people who say or mention this are themselves on the list. Why? Simple: they cannot do it.
Click to enlarge (PDF version)
It’s an honor to welcome another Brit to Winnipeg. Sir Richard Branson arrived to champion a pilot program, through RAY, to put cellular phones in the hands of homeless and at-risk youth. Branson unveiled “Phones for Change” Thursday through Virgin Mobile Canada and Virgin Unite, two subsidiaries of the Virgin Group, the international giant he founded.
When we talk about branding here at Fresh Traffic, here is a company that has done it better than most. Richard, born in Blackheath, London, started his first business venture, a magazine called Student, at the age of 16. In 1970, he set up an audio-record mail-order business. In 1972, he opened a chain of record stores, Virgin Records, later known as Virgin Megastores. Branson’s Virgin brand grew rapidly during the 1980s, as he set up Virgin Atlantic Airways and expanded the Virgin Records music label.
I have had the privilege of working for one of the Virgin brands online, and have had a few nights out with Richard in the past; a true gentleman with no airs, and a straight-talking, intelligent man.
So Rich, welcome to Winnipeg, have a great few days with us and no doubt we will run in to each other very soon.
Richard’s Winnipeg Chamber video
There’s merit to the web search team and the tidbits of news they put out every day. Sometimes they talk about changes to the algorithm, and about when to expect a shift in search rankings or placements.
It’s not uncommon for sweeping changes to be made to the web which leave old code irrelevant; if every little bit were always kept as part of the formula, the search algorithms wouldn’t be the few hundred factors they are now, they would number in the thousands. The search industry would be substantially slower moving, both in use and as a business. Building a website would be an absolute nightmare if you still had to worry about table construction for a half dozen different browsers, or whether the Flash menu you’ve built with fancy fly-away modules would even start up in other ones. Thankfully, none of that is an issue, as old coding techniques get replaced with much more up-to-date methods.
One of the basics of web development that we’ve always made sure to mention to web developers and clients is to utilize the meta keywords and description tags. For a number of years the search engine optimization industry has said that the tags no longer bear any relevance or importance, and that the engines themselves will more often than not choose what they’d like to display anyway; that putting time and effort into even writing the few lines of code it takes is a few lines too many. Fair enough, everyone is entitled to their own opinion, but I do offer an example of what can happen when you let the engines decide what to publish as your website’s description in search.
SEARCH. <div> <div > </div> <div> <div templateType=”C1″> <ul role=”listbox” class=”"> <li role=”menuitem”> headline </li>
Just a couple of points to note: this is from a multi-million dollar company, with strong rankings in search and in traditional media, and this is the description that was pulled for use on the results page. The mix-up won’t hurt their positioning online or offline, but it proves that skipping the basics isn’t always a good idea. Skipping the description tag on the premise that it’s an outdated step would be the end for a new site online with that type of description pulled.
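For reference, the few lines of code in question really are few. The wording and site name below are purely illustrative, you would write copy for your own business, but the tag names are the standard ones:

```html
<head>
  <title>Example Widget Shop - Winnipeg</title>
  <!-- The description is what you'd like the engines to show on the results page -->
  <meta name="description" content="Winnipeg's widget shop, offering custom widgets, repairs and same-day service.">
  <!-- The keywords tag carries little ranking weight these days, but costs nothing to include -->
  <meta name="keywords" content="widgets, winnipeg, widget repair">
</head>
```

A couple of minutes writing a clear description beats gambling on whatever stray markup the engines decide to scrape off your page.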
Previously I mentioned how Zuckerberg was being tapped for information about a possible Facebook search engine, and while he admitted to serving something on the order of a billion searches per day, he wouldn’t admit to an engine being on the way. It seems this week that there is still a smattering of discussion trying to discern whether there is a social search engine on the horizon.
There is a strong belief in social circles that the search industry is headed towards a results page served based on your friends list. Every few months there is the odd study, each seeming about as skewed as the recent BingItOn challenge (citing 2:1 acceptance of Bing over Google), that comes out overwhelmingly in favor of a social engine as opposed to a search engine. It’s only my personal opinion, but I think Google and Bing are already headed in the right direction by including some social signals in the organic and paid results. If someone were to build an engine served entirely by social signals, the only one in the space that could feasibly pull it off would obviously be Facebook; their reason for being tight-lipped about the possibility of launching one is likely that the studies are not as accurate as the media would have you believe. As always, time will tell, and change is a very good thing.
Jabber, Jabber, Jabber
There is always a positive side and a negative side to having someone else complete work for you. On the positive side, you are paying a professional to complete a job at a much higher skill level than you could manage yourself. The negative stems from the fact that you are suddenly time constrained, not by your own schedule, but by that of the professional you have hired.
There is no real way to speed up the process of properly completing a project once you have turned it over to someone else. Just as you bring a car to a mechanic to fix your engine, you bring it to an autobody shop to have it repainted; you wouldn’t ask a mechanic to paint it, as that’s not their job, nor are they proficient at it. In the same line of thinking, when your SEO is required to work through your tech department, mandatory steps can end up taking a day or two, and (we have had this happen) sometimes up to two weeks. That is an immense amount of time to lose on any campaign; whether it’s a single day or ten, it’s time lost in search. It’s akin to bringing your car in for a paint job, but having all of the instructions relayed through the mechanic to the autobody technician.
Communication is an incredibly important step in the optimization process, and it works in all directions. Just as it’s important for a client to relay desired keyword position, it’s important that as SEO professionals we regularly communicate with you to keep you apprised of how you are performing. When the communication stops, that’s when the problems begin.
One odd job that every site owner should take the time to do for themselves is to always be on the lookout for new, or soon-to-be, opportunities. The ways these opportunities manifest themselves for your business vary: you could see increased foot traffic at physical stores, or, if you operate strictly online, an improved conversion rate on your site.
Be aware of any changes in your city and target demographics. Whether your aim is to make sales or to have people sign up for a newsletter, your first hurdle is visibility. If you can’t be found for the newest gizmo in your niche, then you may as well not even sell it. Google and Bing both have a handy tool you can use to get a feel for how your niche tracks. Bing’s Keyword Tool lets you discover search trends over the last few months, and shows how trends that initially move slowly can suddenly fly up the rankings.
The tool Google has available is its Trends tool, a different take on the Bing one. Trends has recently been merged with another of their tools, Insights for Search, and has become a research gem. You can conduct research on a window as small as a few weeks, or as far back as 2004; you get readings from a much larger base, and it paints a larger picture for you to make decisions on.
Historical tools and data are amazing for making clearer decisions about where your market is possibly heading, especially now that we’re heading into the Christmas shopping season. As an example, in 2010 in the US alone, $30 billion was spent shopping online, and in 2011 that number rose to $35 billion. You don’t need a large sample to guess that the number is only going to increase again. This is where Freshtraffic comes in: research your niche, decide on your target, and come to us as soon as possible. We’ll help you reach the commanding position you need to take a bite of the (possibly) $40 billion+ online shopping pie.
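As a back-of-the-napkin sketch of where that $40 billion+ guess comes from: the 2010 and 2011 figures are from this post, and the straight-line growth assumption is purely our own simplification:

```python
# Rough straight-line extrapolation of US online holiday spending (in $ billions).
# The 2010 and 2011 figures come from the post above; assuming the same
# year-over-year growth repeats is a deliberate simplification.
spending = {2010: 30, 2011: 35}

growth = spending[2011] - spending[2010]   # +$5bn year over year
projection_2012 = spending[2011] + growth  # a simple guess for 2012

print(f"Projected 2012 US online spend: ${projection_2012}bn")
# → Projected 2012 US online spend: $40bn
```

Real markets rarely grow in a perfectly straight line, of course, which is exactly why tools like Trends, with years of historical data behind them, beat napkin math for planning a campaign.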