Browsing "algorithms"

Searching in the Future – The Web Squared

Jan 6, 2011   //   by freshtraffic   //   algorithms, business, ecommerce, facebook, Google, internet, local search, media technology, New Media, search marketing, twitter, web development  //  Comments Off

With the rapid advancement of the web, the technologies that drive it and the ways people interact with it, I sometimes wonder what things will look like by 2020.

*cue time warp*

Your morning might go something like this: while getting ready for work, you receive all of your local news feeds directly on your 3D/holo television, already sorted and delivered according to your interests. News snippets and weather announcements followed by sports results, all fully controllable should you want more information. Then the commute to work, in a hands-free car navigating itself to your meetings. No one works in offices anymore; the instant web and cloud offices have made physical locations a throwback to the previous century's way of doing business.

With cloud computing fully integrated into mainstream business, social and everyday use, communication has never been simpler or faster. Terabit internet in the sprawling cities ensures there's always enough bandwidth. And for those with pockets full of money, neural interactivity delivered to a focusing lens worn like glasses provides a vast, interactive surface on which to work and play.

Online search, commerce and social activities will most likely be completely merged; think of a mega-company the likes of a Google and Facebook merger. We'll call it GoogleBook. A complete portal, with news, social feeds from friends and family, shopping via search, and instant messaging for friends, family and clients. Micro-blogging sites like Twitter would be absorbed and added to the already potent offerings of such a massive company. The idea of online privacy will have matured and changed, with the baby boomer generation gone offline to relax in peace and the tech-savvy information generation coming into its prime as the dominant workforce.

The web will be faster, cleaner and more relevant to each individual as the Google search algorithm, the Facebook social algorithm and the Amazon shopping algorithm are written together into one do-it-all super-algorithm. Sign in once, and it will deliver the content you're interested in, show you what your friends have been doing for the last few days, and find the best local deals on the new television you were thinking of buying.

*end time warp*

It's going to be an exciting time to be online, even in the next few years, let alone the next ten. The web and its technologies are growing at an exponential rate: what we've learned and discovered over the last 25 years online will be doubled in the next three to four years, and then that doubling time will be cut again and again, until discoveries arrive at such a rate that new tech is expected every week instead of every couple of months.

You could also subscribe to the theory that it's game over in December 2012. No one knows what's to come in the next few days, let alone years. Here's hoping the web continues to grow, mature and evolve as quickly as it has been.

Google Upgrades from 2010

Dec 23, 2010   //   by freshtraffic   //   algorithms, Google, internet news  //  Comments Off

During the past year, Google made its mistakes along the way, as discussed previously. It also, however, made a number of upgrades and changes to the way the world searches. Google is always tweaking and changing the game, and that can play havoc with the SERPs and your clients' rankings.

The largest and most dramatic of these changes was the 'Mayday' update. It was a fundamental algorithmic change, and it affected a great many sites that focused primarily on long-tail searches, most of them catalogue sites with hundreds of 'item' pages containing little to no links or content. While a lot of sites cried foul, it was really a culling of the SERPs and removed a great deal of fluff from the results pages. Soon after came the Caffeine upgrade to the algorithm, adding speed to the search results. The largest shift in the search game since the Mayday update, it served up a cached version of the search performed and allowed users to reach their destination a tad faster than on previous visits.

The next two major changes brought to the search game were Google Instant and Previews. Google Instant served up search results as a user typed the terms into the search box; essentially, it let the Caffeine update serve results for searches still in progress. The Previews update added a small magnifying glass to the search results, and while that may not seem significant, it served up a small preview frame for each result, showing in a screenshot of the page where the search phrase was located, further speeding up the search experience for users.

As with the mistakes Google made during the past year, there have been a number of upgrades to the engine as well. 2011 is just a few days away, and who knows, another Mayday may be on its way.

New Google Results

Nov 26, 2010   //   by freshtraffic   //   algorithms, fresh traffic, Google, internet strategy, optimized, search engine optimization, SEO Expert, website optimization  //  Comments Off

It's not a new trend that Google is always tweaking the way its search works, or how the results are displayed. And they've made another change live, one which drives home the importance of being certain your website is optimized.

Normally when you conduct a search on Google, the results you receive are a mix of the most relevant websites containing your terms. Occasionally, the same website might show up twice, if the algorithm deemed it applicable. This recent change, however, can return up to four results from the same website, each with shorter descriptive text. The remaining results on the page are the other relevant websites. Now why should this concern someone working on climbing the SERPs? When you perform a search, the results page you see contains the top 10 results. If four of those ten are from the same website, odds are you may not have even made page one. The change is recent, and if you're ranked fifth or lower on page one of the results and you're not diligent with your SEO, you may soon find yourself on page two!
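To picture the mechanics, here is a minimal sketch of how a results page might cap the entries shown per host. This is hypothetical TypeScript, not Google's actual implementation; the page size and per-domain limit are assumptions taken from the behaviour described above.

// Hypothetical sketch of host crowding: allow up to maxPerDomain
// results from any one site on a page of pageSize results.
interface Result { url: string; title: string; }

function buildPage(ranked: Result[], pageSize = 10, maxPerDomain = 4): Result[] {
  const perDomain = new Map<string, number>();
  const page: Result[] = [];
  for (const r of ranked) {
    const host = new URL(r.url).hostname;
    const seen = perDomain.get(host) ?? 0;
    if (seen >= maxPerDomain) continue; // cap reached, skip to the next site
    perDomain.set(host, seen + 1);
    page.push(r);
    if (page.length === pageSize) break; // page one is full
  }
  return page;
}

Under a cap like this, a site that used to sit at position five can slide to page two as soon as one strong domain claims four of the ten slots.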

Page one of the Google SERPs is a difficult nut to crack, and this recent change adds another level of finesse to how your site must be optimized. When interested small business owners reach out to SEO experts, they often balk a little when they learn what it actually costs to hire a professional; but if you think a pro is expensive, try hiring an amateur. Search engine optimization is the best return on investment in the online marketplace, so what is it worth to you to be found?

How to Blekko? To Blek?

Nov 1, 2010   //   by freshtraffic   //   algorithms, Google, internet news  //  Comments Off

The term “Google it” has grown so widespread in popularity that it's no longer a strange phrase to hear on a day-to-day basis. It may be in an ad, in a conversation with a colleague, or in a story you catch online. The search engine and company, Google, has grown to the point that its name is a verb. Not too bad for a string of letters and numbers whose conceptual purpose was to bring order to chaos of sorts.

There has been a host of contenders in the search space: Live Search, now Bing; Yahoo, which is now powered by Bing; and some who fell by the wayside and were either forgotten or scooped up to be made part of a larger whole, such as Fast, Jellyfish.com, Powerset, Cuil and Wolfram Alpha. But none has been able to reliably hold a candle to the power Google commands on the web. Bing has its share and its niche, and there are users who are comfortable with the new and sometimes bothersome Facebook integration, though Facebook still needs to learn the difference between opt-in and opt-out.

The newest kid on the block is Blekko.com, which has chosen to serve the search audience with vertical, more specific searches, as opposed to horizontal, all-encompassing search results. Founded by Rich Skrenta, Blekko aims to improve search results by leveraging the supposed wisdom of the crowd.

“We realized we could make Web tools that let users sign up and help make the search engine better. If we opened up the process, we could not only get orders of magnitude more people involved than we could ever hope to employ, we could also create an open, accountable process around the search engine relevance data.”

In short, by signing up you can create what Blekko calls slashtags, which work much like the hashtags Twitter uses to categorize tweets. The end result is search results that screen out irrelevant pages and content-farm material.
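As a rough illustration of the idea (the tag name and site list below are invented, not Blekko's actual data), a slashtag behaves like a user-curated whitelist applied to the result set. A minimal TypeScript sketch:

// Toy model of a slashtag: a user-curated list of hosts that a
// query like "climate /science" is restricted to.
const slashtags: Record<string, string[]> = {
  science: ["nature.com", "sciencemag.org", "nasa.gov"], // invented example list
};

function applySlashtag(results: { url: string }[], tag: string) {
  const allowed = new Set(slashtags[tag] ?? []);
  return results.filter(r => allowed.has(new URL(r.url).hostname));
}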

The downside noticed with the engine so far is that the very thing which makes it unique (the slashtags) also allows it to be manipulated. Blekko seeks to create highly relevant results with this method, relying on human input as to what is deemed important. The big problem with people, of course, is that we're fallible. What's important on a topic to you and me isn't necessarily important to anyone else on the same subject. It's an interesting search engine, and it delivers hazy results at present, but it does sport the word “beta” on its page, so what the tech will shape up to be is yet to be seen.

Time to be Found – Google Instant

Sep 8, 2010   //   by freshtraffic   //   algorithms, Google, internet marketing, internet news, internet strategy, online branding, online marketing, seo  //  Comments Off

So the big news so far this week would most definitely have to go to the newest change in Google search, Google Instant.

Google Instant is starting to roll-out to users on Google domains in the US, UK, France, Germany, Italy, Spain and Russia who use the following browsers: Chrome v5/6, Firefox v3, Safari v5 for Mac and Internet Explorer v8.

In a nutshell, Google is completing your searches for you as you type, so there's no need to hit the enter button. They broke the search time down in a basic format: it takes about 9 seconds to type a query, 1 second to return results and, on average, 15 seconds to select your best choice. The idea is that by showing results while you're still typing, Instant will reduce search times by as much as 5 seconds!

Unless you're a professional racer of some sort, 5 seconds may not seem like a lot, but it can mean the difference between an ad impression, a click-through, or a new visitor to your site. As a business owner, you need to decide what your time, product and online presence are worth to you. Google Instant isn't widely available yet, and it can be shut off by users who dislike the service. But what are you worth? Is your competitor's site optimized better? Is your nearest rival in a better position on the “normal” SERPs? With the looming introduction of permanent Instant, how much is your online brand worth to you? A difference of 5 seconds could be the difference between a new contract and being on the receiving end of a dusty, unused website.

Dynamic Diverse Search

Aug 23, 2010   //   by freshtraffic   //   algorithms, Google, internet news, search engine optimization, seo, seo strategies  //  Comments Off

There's a new Google test which caught fire in web discussions over the weekend. Google has been running a test segment of its algorithm which displays search results as you type, dynamically updating the page as you add to, change, or remove your query.

The assumption is that the test has been rolled out only to users with a very high-speed connection, as the nature of the results being delivered is unknown. They may come from a cached prefetch server based on your previous searches, but they may also be entirely dynamic in nature, with the engine automatically fetching results as you add each term.
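One plausible shape for such a feature, sketched in TypeScript (the /search endpoint and the 150 ms delay are invented for illustration): debounce the keystrokes, answer from a local cache when possible, and fetch fresh results otherwise.

// Sketch of search-as-you-type: debounce keystrokes, serve cached or
// prefetched results instantly, fetch the rest from a hypothetical endpoint.
const cache = new Map<string, string[]>();
let timer: ReturnType<typeof setTimeout> | undefined;

function onKeystroke(query: string, render: (hits: string[]) => void) {
  if (timer !== undefined) clearTimeout(timer);
  const cached = cache.get(query);
  if (cached) { render(cached); return; } // cached/prefetched path
  timer = setTimeout(async () => {
    const resp = await fetch("/search?q=" + encodeURIComponent(query));
    const hits: string[] = await resp.json();
    cache.set(query, hits);
    render(hits); // dynamically update the page with fresh results
  }, 150); // wait for a brief pause in typing
}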
Couple this recent test with the article reporting that Google is set to allow domain dominance on a search page, and the landscape of the SEO game will change somewhat.

Google said:
Today we've launched a change to our ranking algorithm that will make it much easier for users to find a large number of results from a single site. For queries that indicate a strong user interest in a particular domain, like [exhibitions at amnh], we'll now show more results from the relevant site. Prior to today's change, only two results from www.amnh.org would have appeared for this query. Now, we determine that the user is likely interested in the Museum of Natural History's website, so seven results from the amnh.org domain appear. Since the user is looking for exhibitions at the museum, it's far more likely that they'll find what they're looking for, faster. The last few results for this query are from other sites, preserving some diversity in the results.

This change could make a real difference in small business SEO, and it will definitely encourage niche marketing campaigns. So it's time to put on your creative thinking caps, hash out the creative copy for your clients, and be ready to push for the niche search terms.
From a PR perspective, though, it's an interesting twist on Norvig's earlier comment about wanting more diversity in search results, in which he suggested the second result should be as “different” from the first as possible.

Free Press vs Free ‘Net

Jul 19, 2010   //   by freshtraffic   //   algorithms, Google, internet news, newspaper, NY Times  //  Comments Off

Last week a piece was written in the New York Times which suggested heavily that Google and its algorithm need to be taken in hand and monitored. Citing examples like financial incentives, the fact that Google handles more than 60% of web queries worldwide, and the way a shift in ranking can break small business owners, it pressed for having the government decide what Google can and can't change within the algorithm: make the algo public, let the government decide what tweaks can or can't be made, and let it determine, in the end, what's relevant for users.

Needless to say, it wasn't taken too lightly. Danny Sullivan wrote an entertaining response, using the verbiage from the article nearly word for word but swapping the New York Times in for Google. It's an entertaining article to read, and I suggest taking the time. One of the more enjoyable points for me: he ends by comparing the newspaper to Google, and at one point even sets the two businesses' records on transparency side by side.

Google will list EVERY site that applies for “coverage” unlike the New York Times, which regularly ignores potential stories

If Google blocks a site for violating its guidelines, it alerts many of those sites. The New York Times alerts no one

Google provides an entire Google Webmaster Central area with tools and tips to encourage people to show up better in Google; the New York Times offers nothing even remotely similar

Google constantly speaks at search marketing and other events to answer questions about how they list sites and how to improve coverage; I’m pretty sure the New York Times devotes far less effort in this area

Google is constantly giving interviews about its algorithm, along with providing regular videos about its process or blogging about important changes, such as when site speed was introduced as a factor earlier this year.

In June 2007, Google allowed New York Times reporter Saul Hansell into one of its search quality meetings, where some of the core foundations of the algorithms are discussed.

Whose article rings truer to you?

What are you pointing at?

Jul 15, 2010   //   by freshtraffic   //   algorithms, Google, internet news, seo, seo strategies  //  Comments Off

Well, under a new patent approved this past week, Google will have an idea of just what you like to point at. The patent, titled “System and method for modulating search relevancy using pointer activity monitoring”, was filed in 2005 and granted this week. It describes a system for monitoring the movements of a user-controlled mouse pointer in a web browser, identifying when the pointer moves into a predefined region and when it moves out of that region.

So basically, you can think of it like the hotspot area of an image map, or a div element defined in CSS. Google can assign an area of its SERPs page for analysis and track where a searcher's mouse moves. What type of information this yields, and how it could be applied in the realm of SEO, is still to be determined. It could, however, give Google a better understanding of how well a SERPs page comprised of blended results (both paid and organic) fares.
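In browser terms, the kind of monitoring the patent describes could look something like the TypeScript sketch below. The element selector and the dwell-time logging are invented for illustration; the patent itself publishes no code.

// Sketch: note when the pointer enters or leaves a predefined region
// of the results page, and record how long it dwelled there.
function watchRegion(el: HTMLElement, label: string) {
  let enteredAt = 0;
  el.addEventListener("mouseenter", () => { enteredAt = Date.now(); });
  el.addEventListener("mouseleave", () => {
    const dwellMs = Date.now() - enteredAt;
    // A real system might beacon this back for relevancy analysis.
    console.log("pointer spent " + dwellMs + "ms over " + label);
  });
}

// e.g. watchRegion(document.querySelector("#paid-results")!, "paid block");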

And, in the realm of satire, these headers are from a live website whose owner noticed Google had flagged it as a possible spam site. Internet cookies for those who can see what's wrong. Who said Google doesn't read meta tags?

<meta name="author" content="" />

<meta name="alexa" content="100"></meta>

<meta name="googlebot" content="noodp" />

<meta name="pagerank™" content="10"></meta>

<meta name="revisit" content="2 days"></meta>

<meta name="revisit-after" content="2 days"></meta>

<meta name="robots" content="all, index, follow"></meta>

<meta name="distribution" content="global" />

<meta name="rating" content="general" />

<meta name="resourse-type" content="documents" />

<meta name="serps" content="1, 2, 3, 10, 11, ATF"></meta>

<meta name="relevance" content="high" />

Google vs the EU?

Jul 9, 2010   //   by freshtraffic   //   algorithms, Google, internet news  //  Comments Off

Being the big dog on the playground, it's inevitable that you'll step on some toes. In the most recent instance, it appears Google has stepped on the European Union's toes.

From AP:

The European Union’s antitrust chief said Wednesday he is looking “very carefully” at allegations that Google Inc. unfairly demotes rivals’ sites in search results.

Using language such as “importance of search to a competitive online marketplace”, Almunia nevertheless accepted Google's argument that, given its size and far-reaching strength online, it's difficult at times to behave perfectly in a market as dynamic as the internet. With a storefront active 24/7/365, when a company has worked its way to the top of the game, sometimes the little guys get knocked about unknowingly.

The inquiry was launched, however, due to Google's recent acquisition of the travel network ITA, an online booking agency. Two EU-based comparison sites complained to the union that they were ranked lower in the SERPs because they are competitors. And seeing as higher rankings lead to higher search volumes, the EU may have a case. The algorithms are all programmed, though, with no human hand in the SERPs, so in the end the bottom may fall out of the case.

New Possible Criteria to be Integrated in Search Engine Algorithms

Dec 3, 2009   //   by freshtraffic   //   algorithms, Google, seo, seo techniques  //  Comments Off

The first criterion that may be added is click data. It has been noted that Google News does not use links as a ranking factor; rather, it uses clicks, or the number of clicks, as a ranking factor.

The second criterion is web referencing. A web reference is not a link but rather a mention of the target website on an external website. This means the mere mention of your website on a site you do not own would count as a ranking factor; ranking would no longer be driven by links alone.

A third criterion is the presence of your website on social media networks. Social media is now the latest craze among people from all walks of life, and the popularity of your website within a social media community would contribute greatly to your ranking.

Finally, page loading time is also set to become a factor, which is why website design and web development will play an important role in the field of SEO. The rationale behind using page loading time as a ranking factor is that fast page loads improve the user experience.
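Taken together, a toy version of such a blended score might look like the TypeScript sketch below. The weights and the speed formula are invented placeholders, not anything a search engine has published.

// Toy blended ranking score over the four proposed criteria.
// All weights are invented placeholders.
interface PageSignals {
  clicks: number;        // click data
  mentions: number;      // unlinked web references
  socialShares: number;  // social media presence
  loadTimeMs: number;    // page loading time
}

function score(s: PageSignals): number {
  const speedBonus = 1000 / Math.max(s.loadTimeMs, 100); // faster pages score higher
  return 0.4 * Math.log1p(s.clicks)
       + 0.2 * Math.log1p(s.mentions)
       + 0.2 * Math.log1p(s.socialShares)
       + 0.2 * speedBonus;
}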

With these new criteria added, would they help revolutionise the search engine algorithm? We might also see another shift in the SEO approaches taken by different SEO companies and SEO consultants.
