So finally the election is finished, and the winner has been decided. If for some reason you’ve been living in a cave the last couple of days, Obama took the crown and is set to begin his second term as President of the United States. And regardless of who you were rooting for, there were some interesting discoveries over the last couple of months of the battle, many of which have their roots in search.
A few days back, the Wall Street Journal ran a story about how Google was serving up results pages in what some thought was a strange coincidence. It seemed that even when signed out of a Google account and using a cookie-free browser, the results when searching for Obama almost became personalized. The article even went on to say that the search engine was biased when searching for Obama and related news, with one story coming right out and saying that the candidates were being treated unfairly. While it would make for a great conspiracy story, the unexciting truth is that it’s just how the Google algorithm works. Google simply displayed results based on how people searched for terms, the example being
more people searched for “Obama” followed by searches for “Iran” than the number of people who searched for “Romney” followed by “Iran.”
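The kind of query-sequence counting described above can be sketched in a few lines of Python. This is a toy illustration only, not Google’s actual pipeline; the session logs and terms below are invented for the example:

```python
from collections import Counter

def follow_up_counts(sessions, first_term):
    """Count which queries users issue immediately after searching for first_term."""
    counts = Counter()
    for queries in sessions:
        # Walk each consecutive pair of queries in the session.
        for current, following in zip(queries, queries[1:]):
            if current == first_term:
                counts[following] += 1
    return counts

# Hypothetical session logs: each inner list is one user's consecutive queries.
sessions = [
    ["obama", "iran", "election"],
    ["obama", "iran"],
    ["romney", "iran"],
    ["obama", "polls"],
]

print(follow_up_counts(sessions, "obama"))   # Counter({'iran': 2, 'polls': 1})
print(follow_up_counts(sessions, "romney"))  # Counter({'iran': 1})
```

With logs like these, “iran” follows “obama” more often than it follows “romney,” so an engine leaning on co-search frequency would surface Iran-related results more readily for the Obama query — no bias required.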
That was the first interesting point, the second follows in a similar vein.
It’s not really news anymore that the candidates spent hundreds of millions of dollars on campaigning between them, but it was interesting to find that Obama outbid Romney on search ads online by nearly three to one. Both were bidding on big hitters like “2012 election” and “2012 presidential polls” to lead people to their campaign websites, but it was the incumbent President who owned the paid advertisements on the results pages. Sticking with the trend of online visibility, Obama had Romney beat across the board, with more Facebook fans, website visitors and YouTube video views.
The largest demographic in the voting populace is shifting to a much younger, information-hungry crowd, so being findable online should be an integral cog in any party’s agenda. When you shake all the numbers out, from organic results to paid search, it looks like in the end Obama simply out-optimized his opponent, and that helped secure him a second term.
There have been a number of changes in the search world over the past 15 years since its pseudo-birth, but the changes in the last 12 months have been some of the largest ever. There have been the Panda updates, the Penguin changes, and the EMD (exact match domain) changes, all of which have made search engine optimization a much more interesting job. And while they’re not the only search engine in the game, leave it to Google to make the most news with any change, seeing as they own the vast majority of the market.
I’ve outlined what can happen when you make a mistake and breach one of the rules set forth by the engines. You can take a rankings hit, you could suffer a penalty in the form of losing some (Google) PageRank, or you could even be completely removed from the index if you’ve accumulated enough ‘strikes’ against your website or URL. As search engine optimization experts, it is our job to ensure that neither we nor our clients fall into any of the multitude of pitfalls you can find yourself in. None of these scenarios is unrecoverable, although getting back into the good graces of the search engines will take some time and an extensive SEO skillset.
However, if you don’t have the time, don’t have any search engine optimization skills under your belt, or don’t have the budget to bring in the real search experts, there is a solution for your business. It will still take time, but you don’t have to worry so much about SEO skills initially, because you’re going to start down the road of rebranding. Completely rebuilding your brand image is really a last-resort option, as it can take almost a year to return to the search results pages. If you’ve found yourself far enough up the creek that rebranding is a more viable option than repairing the mistakes you’ve made, perhaps it’s time for an evaluation of your job description.
The numbers for the past month in search came out, and while Google remained on top with the majority share, what was somewhat disheartening was the continuing slide of Yahoo’s position.
There’s been no real shift in the overall numbers: Google is still sitting at just over two-thirds of the search share, and Bing follows with about a 15% share. Yahoo slipped even further than in previous months, to around 12%, giving the combined Bing and Yahoo engines just shy of 30% of the market. Yahoo was one of the primary search engines and one of the first to roll out a paid search marketing platform, so it’s sad to see them slip further out of the limelight of relevancy. Change, however, is inevitable, and is always a good thing for all parties involved.
The search share numbers aren’t terribly surprising in the grand scheme of things, and perhaps it was the additions of Panda and Penguin to the equation, but the number crunchers are at it again. On the webmaster forums there is discussion going on about what the current algorithm may contain and how it might use analytics to help rank the sites in the index.
Some interesting theories are coming out of the discussion, mainly because no one outside of Google really knows the process for ranking the sites within the index. Google has mentioned previously that they don’t use any search data from their Chrome browser, and the running theory so far is that the search giant is using click data from ISPs. In the end, it’s only the team at Google who really knows how the engine ranks its results.
There’s been a small surge of malware reports coming from searches via Bing and Google, which really isn’t news in and of itself, as malicious results are always buried somewhere. What is different is that on some terms, more than 90% of image results were found to be malware related.
The most targeted term this time around happens to be “Emma Watson”, whom McAfee has named their most dangerous celebrity search of 2012. Of the two engines, 30% of Google’s searches had malware warnings attached, and more than double that came out of Bing’s results. Malware takeovers happen in a couple of different ways: one of the most frequent is websites built with little to no security, and then there are throwaway websites and URLs used purely for spreading malware on the web.
Black hat SEOs typically go after the hottest search terms and poke around the web looking for websites with loopholes in their security. They actively work to hijack the website and its URL, to lend false authority to whatever term they’re wanting to spam. And because uneducated or hasty users tend to automatically trust the top results in the search engines, the spread of malware will continue.
Because of the recent discoveries that image results are getting slammed with malicious results while the text results pages are being left behind, Bing has (currently) been unofficially dubbed the most poisonous search engine. The only reason the moniker has been attached to the service is the recent report about malicious websites now targeting image searches as opposed to the text results pages. Not to fret, however, as Bing and Google will take steps to close the holes that have been opened in the image results; in the meantime, just be a little more cautious before clicking that top image of your favorite star.
There’s merit to the web search team and the tidbits of news they put out every day. Sometimes they talk about changes to the algorithm, and about how to expect a shift in the search rankings or placements.
It’s not uncommon for sweeping changes to be made to the web that leave old code irrelevant. If every little bit were always left as part of the formula, the search algorithms wouldn’t be the few hundred factors they are now; they would be in the thousands. The search industry would be substantially slower moving, both in use and as a business. Building a website would be an absolute nightmare if you still had to worry about table construction for a half dozen different browsers, and whether the Flash menu you’ve built with fancy fly-away modules would even start up in others. Thankfully, none of that is an issue, as old coding techniques get replaced with much more up-to-date methods.
One of the basics of web development that we’ve always made sure to mention to web developers and clients is to utilize the meta keywords and description tags. For a number of years, the search engine optimization industry has said that the tags no longer bear any relevance or importance, and that the engines themselves will more often than not choose what they like to use anyway. Putting time and effort into the few lines of code it takes to add them, the thinking goes, is a few lines too many. Fair enough, everyone is entitled to their own opinion, but I do offer an example of what can happen when you let the engines decide what to publish as your website’s description in search.
SEARCH. <div> <div > </div> <div> <div templateType="C1"> <ul role="listbox" class=""> <li role="menuitem"> headline </li>
Just a couple of points to note: this is from a multi-million-dollar company, with strong rankings in search and in traditional media, and this is the description that was pulled for use in the results page. This mix-up won’t hurt their positioning online or offline, but it proves that skipping the basics isn’t always a good idea. Skipping the description tag on the premise that it’s an outdated step could be the end for a new site online with that type of description pulled.
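If you want to audit whether a page actually supplies its own description (so the engine isn’t left to scrape raw markup like the example above), a small script does the job. This is a minimal sketch using Python’s standard library; the sample page content is made up:

```python
from html.parser import HTMLParser

class MetaDescriptionFinder(HTMLParser):
    """Collect the content attribute of any <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)  # attrs arrives as a list of (name, value) pairs
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content")

def find_description(html):
    finder = MetaDescriptionFinder()
    finder.feed(html)
    return finder.description

# A made-up page that does supply its own description.
page = '<html><head><meta name="description" content="Acme widgets, est. 1990."></head></html>'
print(find_description(page))  # Acme widgets, est. 1990.

# A page with no description tag — the engine would have to invent one.
print(find_description("<html><head></head><body></body></html>"))  # None
```

Running something like this across a site makes it easy to spot pages that are leaving their search snippet entirely up to the engine’s whims.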
It’s long been clear that a search engine is a search engine, and there are a handful online that receive the majority of the traffic out there. Google, Bing and Yahoo are the usual ways people search, with sites like Blekko, DuckDuckGo, and Ask also being used by those desiring a different experience. Facebook, however, is one of the largest websites online, and with what is approaching a billion users, the world has been waiting to see if it is going to try and enter the search arena.
Ever since Facebook went public, the stock has been a sort of tepid pool, and with no real revenue model, the online mutterings often turn to the topic of a search engine. And when Mark Zuckerberg throws around statements akin to “Facebook is doing a billion searches per day without trying,” the mutterings pick up some volume. In a recent TechCrunch interview, Zuck made it clear that the company realizes there is a huge opportunity for a search engine within Facebook, but tempered that by also talking about how the way users search is ever evolving.
“Search engines are evolving” to “giving you a set of answers,” Zuckerberg said. “Facebook is pretty uniquely positioned to answer a lot of questions people have. At some point we’ll do it,” he went on. “We have a team working on search.”
There are some real concerns about just how Facebook could leverage its massive user base as a search engine, however, and it has less to do with spidering capabilities than with privacy. The system as he briefly described and envisioned it would mean taking the opinions of your friends, family and contacts and trying to form a result for your query. Searching for terms like ‘best burger’ or ‘new batman movie reviews’ wouldn’t necessarily be informational, but would deliver a list of opinions from your contact list. At any rate, it’s not happening today or tomorrow, or even soon for that matter. But when (or if) it does, it will introduce change into the search landscape and the online experience as a whole, and change is very, very good.
Google is the most widely used search engine globally, and accounts for roughly 65% of usage in North America. Bing is the rebrand of the Live Search service, and as of late it has been holding steady at around 30% of the search marketplace (counting the Yahoo searches it now powers).
Bing has long contended that it offers a comparable search service, and some in the search world share that sentiment. But even with the rebrand, the television commercials, and taking over the Yahoo search market, their share remains steady at a third or so of the market. Their response, dubbed the Bing It On test, is a blind survey that displays unformatted, unbranded results, and the user decides which of the two sets of results they would use. It’s a testing method also known as the Pepsi Challenge, where random people were given a sample of two drinks and asked which they preferred.
When Bing tallied the results of their (very small) online sample of 1,000 people, they found that users chose the Bing results at almost a 2:1 ratio. That’s a rather large statistical difference from the current norm, with Google dominating the search market share, so why don’t the numbers match the current market share? Well, for starters, the sample is incredibly small. A data sample of 1,000 people in the 18+ demographic is a drop in the ocean, with somewhere north of 200 million adults in the US alone. If you’re interested in which search engine appeals to you as a user, you can try out their survey for yourself here.
Previously I wrote about a Google patent which has gained more and more traction in the search community. The patent in question is named Ranking documents, and as mentioned previously, it seemed like an interesting tactic to employ to catch those who use less-than-ideal tactics to rank a website.
Unfortunately, you’ll often find that people discussing search engine optimization will tack on terms like spamming or buying links, but the truth of it is, when you do it right, SEO is none of those things. I think of the newly discovered patent like a magic trick: it’s a bit of sleight of hand that Google is using to get those unscrupulous tacticians to reveal themselves. Google shows them the index, and they try to spam their way to a high ranking. When Google notices the spam, it shows a different set of results (if you don’t think they can do that, you’re not thinking clearly) just to see the reaction of their target. If they continue to offend, their site will likely be penalized, while the legitimately ranking pages won’t have been affected negatively.
Spamming links to a website to give undeserved weight to a term is a measure used by those in the industry who are in the game for only a few quick clicks. Often working from disposable URLs, they will target a hot, trending term – as an example, Paul Ryan’s RNC speech – and will try to garner as many links and as much visibility as possible in as short a time frame as they can. It’s a pump-and-dump tactic that is very likely to get the offending site banned from the index and the URL blacklisted.
I foresee that this patent will be dissected, discussed, scrutinized and blamed for all manner of SEO problems and headaches to come. The only problem that some site owners have with Google, Bing and Yahoo is that they’re getting better and better at catching the people who try to cut corners or use less-than-natural methods to rank their site. Maybe one day soon we’ll have an adaptive algorithm which detects and removes spam sites actively as you search, so black hatters go the way of the dodo bird.
There have been adjustments, changes, and what seem like complete rewrites of the algorithm Google started with in the beginning. At first, when you searched, the results you were given were based directly on the query as you’d entered it, and sorted by how many backlinks each page had. Now, however, when you search for ‘Winnipeg Jets’, images and video for the team appear even though the words “images” or “videos” weren’t in the query.
The algorithm that Google, Bing and a handful of others use has grown and evolved to the point where it tries to anticipate what you’re searching for, as well as answering the direct query you typed. The search engines are getting better at surfacing what you want to see on a given topic, and seem to weigh the number of clicks through to a result alongside all of the previous criteria. And as you’ve probably mistyped a word or two while searching, you’ll have noticed that search engines are also able to correct commonly made spelling mistakes. What the engines are getting much better at, however, is not just correcting your spelling, but interpreting what you may actually be searching for. Google can load dictionaries of how words should be spelled along with common misspelled variations of those words, and can look at how searchers correct their searches and when they click on different variations. It can use this data not only to suggest a query with a different spelling, but to treat the misspelling as a synonym behind the scenes and rank the correctly spelled matches.
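The misspelling-as-synonym idea can be sketched in a few lines. This is a toy illustration only — a real engine mines its correction tables from massive query logs (which corrections users accept, which variants they click), while the misspelling list here is invented:

```python
# Hypothetical table mapping common misspellings to canonical spellings.
MISSPELLINGS = {
    "wheather": "weather",
    "recieve": "receive",
    "winipeg": "winnipeg",
}

def normalize_query(query):
    """Rewrite each term to its canonical spelling before matching documents,
    so misspelled queries rank the correctly spelled pages behind the scenes."""
    return " ".join(MISSPELLINGS.get(term, term) for term in query.lower().split())

print(normalize_query("Winipeg jets"))     # winnipeg jets
print(normalize_query("recieve updates"))  # receive updates
print(normalize_query("weather report"))   # weather report (already correct)
```

The key point the paragraph makes is visible here: the correction happens before ranking, so pages spelled correctly can match a misspelled query without the user ever retyping it.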
In the same basket as interpreting what you might be looking for, Google is noticeably moving forward on trying to discern your intent while searching as well. A basic description of how it works:
Keyphrases don’t have to be in their original form. We do a lot of synonyms work so that we can find good pages that don’t happen to use the same words as the user typed. – Matt Cutts
Even Bing is getting in on the act: if you misspell a common word or phrase, you’ll still often be brought to the correct results. Perhaps soon enough you won’t need to search by typing, but by simply visiting your preferred search engine. And because you researched a new car and purchased one via online dealership shopping, the engine knows that in a few months it should perhaps deliver you information on local garages offering oil changes and tire rotation services.
Google has been king of the search world almost from the day it became a tool on the web. There are a handful of other search engines as well, all of which do their best to offer a choice when you’re looking for online information.
There’s been some discontent with Google as a service, and it sometimes leaves users craving an alternative to the giant. There is a small problem with that idea, however, and it’s the same reason Google is so successful. When you consider the basics of search, if you have quality content that people easily link to, you’re going to be well represented on the search engines: Google, Bing, etc. Google has simply worked out how best to deliver the content you’re most likely searching for, because it can evaluate both the content and the links leading back to it.
That doesn’t mean, though, that the web and search are doomed to stagnation. Everyone is working to innovate in the space, trying to find the newest and biggest evolution in search and online interaction. There are some out there that do their best to be an answer engine, where you can basically query a database for an answer, and there are others which pride themselves on being hand curated by teams of human users to help promote the most relevant content. Bing and Google are both trying their hands at integrating your social life into your search results, both with mixed success at present. But what all of these search engines stumble upon is the same issue: all of the current relevant results are built primarily upon links and link structures to help give value and authority to a website.
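The link-based authority everything above rests on is classically modeled by PageRank. Here’s a minimal sketch over a made-up three-page graph — an illustration of the idea that links confer value, not Google’s actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to.
    Each iteration, every page passes a damped share of its rank
    along its outbound links; rank accumulates on well-linked pages."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Base rank every page keeps regardless of links.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Invented toy graph: "c" is linked to by both "a" and "b".
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # c — the most linked-to page wins
```

Even in this tiny graph, the page that earns the most links ends up with the most authority, which is exactly the property spammers game and the engines defend.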
The future of search won’t lie in constructing links back to your quality content; it will arrive when someone comes up with a search engine that can predict what it is you may be searching for. When you start looking for a home for sale in a new city, for example, based upon your current and previous searches it could determine that you need a home near a school for your children, and deliver those results to you as the most relevant. The technology doesn’t quite exist in such a form at the moment, as it would require massive amounts of calculation to hold the web open, ready to pick out the points you’re searching for. But the web and its technology grow every day, and perhaps soon enough we’ll be able to talk to our devices to find what we want.