Category Archives: algorithms

Mobile Search Advancement

With how compact, powerful and convenient today's smartphones are becoming, and with the rise of the netbook and tablet PC, it's no surprise at all that mobile search, search conducted using the aforementioned technologies, is growing in leaps and bounds. A very general breakdown of Google's numbers was posted on their blog this past week:

Over the past two years, Google’s mobile searches have grown by more than five times. Furthermore, in the third quarter of 2010, Google mobile searches jumped 130% year over year.

Percentages are amazing to look at and all, but they should be taken with some thought; they can make the actual results seem much larger than they are (a 130% jump from a small base is still a modest absolute number). But onto mobile search! Google, like Bing and other search companies, has its own keyword search tool. It has, however, recently added the ability to check which terms are being used in mobile searches.

The Keyword Tool now helps you build a better keyword list to target mobile users. Under “Advanced options,” you can now search for keywords for devices with mobile WAP browsers, mobile devices with full Internet browsers (think iPhone and Android phones), or all mobile devices.

So your site, which by now is hopefully mobile friendly (it is 2011 after all), can be optimized with the mobile market in mind. With the billions of dollars in revenue generated via mobile technology this past holiday season, it's well worth the investment.

The Greatest Guessing Game

What is the greatest guessing game, you ask? It's the game which has made Google, Microsoft and Yahoo, as well as other search engine start-ups and even failures, piles of money just by mention of the word. Search is the greatest guessing game.

When Google took the game and applied its own rules, it came to dominate the online community, propelling itself forward and clawing for all the information it could find. Various illustrations of the web come to mind when it's pictured. First as a web, of course, of interconnecting websites and pages, which the search bots (spiders, naturally) navigate to build up the interconnectedness between them. I've seen the internet visualized as planets in galaxies and solar systems, as continents on a map, and even as a DNA strand at one point. The best visualization I can come up with is that of an ocean, with all of the websites and pages of the internet just kind of floating around. People are like little fish, darting from point to point, sometimes finding what they want, sometimes not. But it's a fluid environment, never the same from day to day and always on the move.

An article about which search engine is better at delivering relevant results was the inspiration for today. It tried to demonstrate that, by running identical queries through different search engines, one could clearly deliver better and more relevant results than the other. The reality, I believe, is much murkier than that. Google is absolutely a brand name, used extensively in all walks of life. Bing is working hard on branding itself as a decision engine rather than a search engine, but in the end both algorithms do primarily the same thing. They guess at what you're looking for, they guess that they're delivering what you want to see, and they guess mostly correctly only because you've already told them what you want, whether via your search history, cookies saved on your computer, or your directly typed search query. Search is still just a game, and for now Google still plays it best. The internet and online technology being what they are, we'll revisit the topic in a year and everything may be upside down.

Searching in the Future – The Web Squared

With the rapid advancement of the web, the technologies that control it and the ways people interact with it, I sometimes wonder what's going to happen by 2020.

*cue time warp*

Your morning might go something like this: while getting ready for work, you receive all of your local newsfeeds directly on your 3D/holo television, already sorted and delivered according to your interests. News snippets and weather announcements followed by sports results, all fully controllable should you desire more information. Then the commute to work, in a hands-free car navigating itself to your meetings. No one works in offices anymore; instant web and cloud offices have made physical locations a throwback to the previous century's way of doing business.

With cloud computing fully integrated into mainstream business, social and common use, communication has never been simpler or faster. Terabit internet in the sprawling cities ensures that there's always enough bandwidth. And for those with pockets full of money, neural interactivity direct to a focusing lens you wear like glasses, providing a vast, interactive surface with which to work and play.

Online search, commerce and social activities will most likely be completely merged; think of a mega-company the likes of a Google and Facebook merger. We'll call it GoogleBook. A complete portal, with news, social feeds from friends and family, shopping via search, and instant messaging for friends, family and clients. Micro-blogging sites like Twitter would be absorbed and added to the already potent offerings provided by such a massive company. The idea of privacy online will have matured and changed, with the baby boomer generation gone offline to relax in peace and the tech-savvy information generation coming into its prime as the dominant workforce.

The web will be faster, cleaner and more relevant to each individual as the Google search algorithm, the Facebook social algorithm and the Amazon shopping algorithm are all woven together into one do-it-all super algorithm. The moment you sign in, it will deliver the content you're interested in, show you what your friends have been doing the last few days, and find the best local deals on the new television you were thinking of buying.

*end time warp*

It's going to be an exciting time to be online, even in the next few years, let alone the next 10. The web and its technologies are growing at an exponential rate: what we've learned and discovered over the last 25 years online will be doubled in the next 3-4 years, and then that doubling time will be cut again and again, until discoveries come at such a rate that new tech is expected every week instead of every couple of months.

You could also subscribe to the theory that it's game over in December 2012. No one knows what's to come in the next few days, let alone years. Here's hoping the web continues to grow, mature and evolve as quickly as it has been.

Google Upgrades from 2010

During the past year, Google made its share of mistakes along the way, as discussed previously. It also, however, made a number of upgrades and changes to the way the world searches. Google is always tweaking and changing the game, and that can play havoc with the SERPs and your clients' rankings.

The largest and most dramatic of these changes took the form of the 'Mayday' update. It was a fundamental algorithmic change, and it affected a great many sites which focused primarily on long-tail searches, most of them catalogue sites with hundreds of 'item' pages containing little to no links or content. While a lot of sites cried foul, it was really a culling of the SERPs and removed a great deal of fluff from the results pages. Soon after came the Caffeine upgrade to the algorithm, adding speed to the search results. The largest shift in the search game since the Mayday update, it served up a cached version of the search performed and allowed users to reach their destination a tad faster than on previous visits.

The next two biggest changes brought to the search game were Google Instant and Instant Previews. Google Instant serves up search results as a user types terms into the search box, essentially letting the Caffeine update deliver results on the fly. The Previews update added a small magnifying glass to the search results, and while that may not seem significant, it serves up a small preview frame for each result: a screenshot detailing where on the page the search phrase is located, further speeding up the search experience for users.

As with the mistakes Google made during the past year, there have been a number of upgrades to the engine as well. 2011 is just a few days away, and who knows, another Mayday may be on its way.

New Google Results

It's nothing new that Google is always tweaking the way its search works, or how the results are displayed. And they've made another change live which drives home the importance of making certain your website is optimized.

Normally when you conduct a search on Google, the results you receive are a mix of the most relevant websites containing your terms. Occasionally, when applicable, the same website might show up twice, if the algorithm deemed it relevant. This recent change, however, can return up to 4 results from the same website, with shorter descriptive text in the results; the remaining results on the page will be the other varied relevant websites. Now why should this concern someone working on climbing the SERPs? Well, when you perform a search, the results page you see contains the top 10 results. If 4 of those top 10 are from the same website, odds are you may not have even made page one. So if you're ranking 5th or lower on page one of their results and you're not diligent with your SEO, you may soon find yourself on page two!

Page one of the Google SERPs is a difficult nut to crack, and this recent change adds another level of finesse that has to be applied as your site is optimized. When interested small business owners reach out to SEO experts, they often balk a little when they learn what it actually costs to hire a professional; but if you think a pro is expensive, try hiring an amateur. Search engine optimization is the best return on investment in the online marketplace, so what is it worth to you to be found?

How to Blekko? To Blek?

The term "Google it" has grown so widespread in popularity that it's not a strange phrase to hear anymore on a day-to-day basis. It may be in an ad, a conversation with a colleague or something you catch in a story online. The search engine and company, Google, has grown in such a way that it's actually a verb. Not too bad for a string of letters and numbers whose conceptual purpose was to bring order to chaos of sorts.

There's been a host of contenders in the search space: Live Search, now Bing; Yahoo, which is now powered by Bing; as well as some who fell by the wayside and were either forgotten or scooped up to be made part of a larger whole: Fast, Jellyfish.com, Powerset, Cuil and Wolfram Alpha. But none have been able to reliably hold a candle to the power that Google commands over the web. Bing has its share and niche, and there are users who are comfortable with the new and sometimes bothersome Facebook integration, from a company that still needs to learn the difference between opt-in and opt-out.

The newest kid on the block is Blekko.com, which has chosen to serve the search audience with vertical, more specific searches, as opposed to horizontal, all-encompassing search results. Founded by Rich Skrenta, Blekko aims to improve search results by leveraging the supposed wisdom of the crowd.

“We realized we could make Web tools that let users sign up and help make the search engine better. If we opened up the process, we could not only get orders of magnitude more people involved than we could ever hope to employ, we could also create an open, accountable process around the search engine relevance data.”

In short, by signing up you can create what Blekko calls slashtags, which work much like Twitter's hashtags for categorizing tweets: appending a tag such as /health to a query restricts the results to sites curated under that tag. The end result is search results which ignore irrelevant pages and farm content.
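Conceptually, you can think of a slashtag as a curated whitelist of hosts that filters the result set. Here's a minimal TypeScript sketch of that idea; the tag names, hosts and result type below are invented for illustration, and this is in no way Blekko's actual implementation:

// Conceptual sketch only: a slashtag as a curated host whitelist.
// The tags, hosts and the Result type are invented for illustration.
const slashtags: Record<string, Set<string>> = {
  "/health": new Set(["mayoclinic.com", "nih.gov"]),
  "/tech": new Set(["arstechnica.com", "wired.com"]),
};

interface Result {
  host: string;
  url: string;
  title: string;
}

function applySlashtag(results: Result[], tag: string): Result[] {
  const allowed = slashtags[tag];
  if (!allowed) return results; // unknown tag: leave the results untouched
  // Keep only results whose host was curated under the slashtag,
  // dropping the spam and content-farm pages that never made the list.
  return results.filter((r) => allowed.has(r.host));
}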

The downside noticed with the engine so far is that the key which makes it unique (the slashtags) also allows it to be manipulated. Blekko seeks to create highly relevant results with this method, based on human input as to what is deemed important. The big problem with people, of course, is that we're fallible. What's important on a topic to you and me isn't necessarily important to anyone else on the same subject. It's an interesting search engine, and it delivers hazy results at present, but it does sport the word "beta" on its page, so what the tech will shape up to be is yet to be seen.

Time to be Found – Google Instant

The big news so far this week most definitely has to be the newest change in Google search: Google Instant.

Google Instant is starting to roll-out to users on Google domains in the US, UK, France, Germany, Italy, Spain and Russia who use the following browsers: Chrome v5/6, Firefox v3, Safari v5 for Mac and Internet Explorer v8.

In a nutshell, Google is completing your searches for you as you type, so there's no need to hit that enter button. They broke the search time down in a basic format: it takes about 9 seconds to type a query, 1 second to return results and, on average, 15 seconds to select your best choice, roughly 25 seconds in all. Since typing dominates that total, predicting the query as it's typed is where the savings come from; the idea is that Instant will reduce search times by as much as 5 seconds!

Unless you're a professional racer of some sort, 5 seconds may not seem like a lot, but it can mean the difference between an ad impression, a click-through, or a new visitor to your site. As a business owner, you need to decide what your time, product and online presence are worth to you. Google Instant isn't widely available yet, and it can be shut off by users who dislike the service. But what are you worth? Is your competitor's site optimized better? Is your nearest rival perhaps in a better position on the "normal" SERPs? With the looming introduction of permanent Instant, how much is your online brand worth to you? A difference of 5 seconds could mean the difference between landing a new contract and being left with a dusty, unused website.

Dynamic Diverse Search

There's a new Google test which has caught fire in web discussions over the weekend. Google has been running a test algorithm segment which displays your search results as you type, dynamically updating the page as you add, change or remove terms from your query.

The assumption is that the test has been rolled out only to those with a very high-speed connection, as the nature of the results being delivered is unknown. They may come from a cached prefetch server based on your previous searches, but they may also be entirely dynamic in nature, automatically fetched as you add each term.
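To make the mechanics concrete, here's a minimal client-side TypeScript sketch of search-as-you-type: wait for a short pause in typing, then fetch results for the partial query. The /search endpoint, the element IDs and the 150ms delay are all assumptions for illustration, not Google's actual implementation:

// Hypothetical sketch: debounce keystrokes, then fetch results for the
// current partial query. Endpoint, IDs and delay are made up.
const input = document.querySelector<HTMLInputElement>("#query")!;
const resultsEl = document.querySelector<HTMLElement>("#results")!;

let timer: number | undefined;
let lastQuery = "";

input.addEventListener("input", () => {
  window.clearTimeout(timer);
  timer = window.setTimeout(async () => {
    const query = input.value.trim();
    if (query === lastQuery || query.length === 0) return;
    lastQuery = query;
    const response = await fetch(`/search?q=${encodeURIComponent(query)}`);
    const results: { title: string; url: string }[] = await response.json();
    // Re-render the result list for the query as it stands right now.
    resultsEl.innerHTML = results
      .map((r) => `<li><a href="${r.url}">${r.title}</a></li>`)
      .join("");
  }, 150);
});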
Couple this recent test with the article reporting that Google is set to allow domain dominance on a search page, and it will change the landscape of the SEO game somewhat.

Google said:
Today we've launched a change to our ranking algorithm that will make it much easier for users to find a large number of results from a single site. For queries that indicate a strong user interest in a particular domain, like [exhibitions at amnh], we'll now show more results from the relevant site:

Prior to today's change, only two results from www.amnh.org would have appeared for this query. Now, we determine that the user is likely interested in the Museum of Natural History's website, so seven results from the amnh.org domain appear. Since the user is looking for exhibitions at the museum, it's far more likely that they'll find what they're looking for, faster. The last few results for this query are from other sites, preserving some diversity in the results.
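As a rough illustration of the host-crowding behavior that quote describes (purely a guess at the logic, not Google's code), capping results per domain but relaxing the cap for a domain the query shows strong interest in might look something like this sketch, which reuses the "two results" and "seven results" figures from the quote:

// Illustrative guess at host-crowding: cap results per host, but relax
// the cap when the query shows strong interest in one particular host.
interface Ranked {
  host: string;
  score: number;
}

function crowdResults(ranked: Ranked[], strongInterestHost?: string): Ranked[] {
  const perHostCap = 2; // the old default: at most two results per site
  const relaxedCap = 7; // roomier cap for a domain the user clearly wants
  const seen = new Map<string, number>();
  return ranked.filter((r) => {
    const cap = r.host === strongInterestHost ? relaxedCap : perHostCap;
    const count = seen.get(r.host) ?? 0;
    if (count >= cap) return false; // this host has used up its slots
    seen.set(r.host, count + 1);
    return true;
  });
}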

This change could make a real difference in small-business SEO, and it will definitely encourage niche marketing campaigns. So it's time to put on your creative thinking caps, hash out the creative copy for your clients, and be ready to push for the niche search terms.
From a PR perspective, though, it's an interesting twist on Norvig's earlier comment about wanting more diversity in search results, with the second result being as "different" from the first as possible to encourage diversity.

Free Press vs Free ‘Net

Last week a piece in the New York Times suggested heavily that Google and its algorithm need to be taken in hand and monitored. Citing examples like financial incentives, the handling of 60%+ of web queries worldwide and the way Google can break small business owners with a shift in rankings, it pressed for having the government decide what Google can and can't change within the algorithm: make the algo public, let the government decide what tweaks can or can't be made, and let it determine, in the end, what's relevant for users.

Needless to say, it wasn't taken too lightly. Danny Sullivan wrote an entertaining response, using the verbiage from the article nearly word for word but substituting the New York Times for Google. It's an entertaining article to read, and I suggest taking the time. One of the more enjoyable points for me: at the end he compares the New York Times to Google, including a point-by-point comparison of the two businesses' transparency.

Google will list EVERY site that applies for “coverage” unlike the New York Times, which regularly ignores potential stories

If Google blocks a site for violating its guidelines, it alerts many of them. The New York Times alerts no one

Google provides an entire Google Webmaster Central area with tools and tips to encourage people to show up better in Google; the New York Times offers nothing even remotely similar

Google constantly speaks at search marketing and other events to answer questions about how they list sites and how to improve coverage; I’m pretty sure the New York Times devotes far less effort in this area

Google is constantly giving interviews about its algorithm, along with providing regular videos about its process or blogging about important changes, such as when site speed was introduced as a factor earlier this year.

In June 2007, Google allowed New York Times reporter Saul Hansell into one of its search quality meetings, where some of the core foundations of the algorithms are discussed.

Whose article rings truer to you?

What are you pointing at?

Well, under a new patent approved this past week, Google will have an idea of just what you like to point at. The patent, titled "System and method for modulating search relevancy using pointer activity monitoring", was filed back in 2005. It describes a system for monitoring the movements of a user-controlled mouse pointer in a web browser, identifying when the pointer moves into a predefined region and when it moves out of said region.

So basically you can think of it as being like a hotspot link area in an image map, or a div element in CSS. Google can assign an area of their SERPs page for analysis, and track where a searcher's mouse moves. What type of information it yields, and how it could possibly be applied in the realm of SEO, is still to be determined. It could, however, give Google a better understanding of how well a SERPs page comprised of blended results (both paid and organic) fares.
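As a rough illustration of the concept only (the patent describes Google's server-side system; this is a hypothetical client-side TypeScript sketch with made-up selectors and a made-up logging endpoint), pointer dwell time over a predefined region can be measured with ordinary browser events:

// Rough sketch of the idea: time how long the pointer dwells over each
// search result. The ".result" selector and /log endpoint are invented.
document.querySelectorAll<HTMLElement>(".result").forEach((region, rank) => {
  let enteredAt = 0;

  region.addEventListener("mouseenter", () => {
    enteredAt = performance.now(); // pointer moved into the region
  });

  region.addEventListener("mouseleave", () => {
    const dwellMs = performance.now() - enteredAt;
    // Report which result was hovered over, and for how long; aggregated
    // across many users this could hint at perceived relevance.
    navigator.sendBeacon("/log", JSON.stringify({ rank, dwellMs }));
  });
});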

And, in the realm of satire, these headers are from a live website whose owner noticed Google had flagged it as a possible spam site. Internet cookies for those who can see what's wrong. Who said Google doesn't read meta tags?

<meta name="author" content="" />

<meta name="alexa" content="100"></meta>

<meta name="googlebot" content="noodp" />

<meta name="pagerank™" content="10"></meta>

<meta name="revisit" content="2 days"></meta>

<meta name="revisit-after" content="2 days"></meta>

<meta name="robots" content="all, index, follow"></meta>

<meta name="distribution" content="global" />

<meta name="rating" content="general" />

<meta name="resourse-type" content="documents" />

<meta name="serps" content="1, 2, 3, 10, 11, ATF"></meta>

<meta name="relevance" content="high" />