Tag Archives: google

Canada’s Growing Online

Canada is still really coming into its own online, and perhaps Google has come up with the right idea to encourage some growth. The Google eTown award is designed to recognize towns where small businesses are investing in online tools and resources to find new customers, grow their business, and improve their operations.

"We're delighted to recognize the accomplishments of the small businesses embracing the web in each of these cities," said Chris O'Neill, Managing Director of Google Canada. "We know that the Internet is going to contribute massively to Canada's economic growth."

We've been saying for a few years here at Fresh that online business is still coming into its own for all of us Canucks; hopefully this new incentive will help spur things along. Google isn't playing favourites either, and has divided the country into five zones: Atlantic, Quebec, Ontario, Prairies, and BC & North. This year the five winners are Moncton, New Brunswick; Dorval, Québec; Parry Sound, Ontario; Canmore, Alberta; and Duncan, British Columbia.

In each of the winning towns, businesses have discovered that the Internet is a vital tool for reaching new customers and engaging with existing ones. In recognition of their eTown status, Google Canada will be presenting the awards to each city during events throughout October. Local businesses will be invited to attend and meet Google experts who will be on hand to provide advice on growing a business online. Actively working on your brand image is one of the key initiatives to pursue while growing online. Being able to translate your offline business into your online image is one of the goals you should be aiming for, and when you're ready for that step, Freshtraffic is here to help.

Updates All Around & A Blast From the Past

There have been a couple of Google updates in the last week, which have had a visible effect on the results pages. One of the updates was fairly simple: a Panda refresh which, according to the available information, impacted less than 1% of searches. The other changed how the results page returns your search results.

The Panda update was another in a long string of updates designed to improve overall search quality, and with the change affecting such a small sample, if you're following the rules you shouldn't perceive any issue. If you're a numbers person, you can compare the less than 1% affected by this latest update to the 12% affected when Panda made its initial breakout onto the web. One point to bear in mind, with this news coming straight from Google, is that there may still be some surprises under the hood of this latest update.

Maybe the extra work that went into the Panda update has something to do with the tweet from Matt Cutts earlier this week?

Just fyi, we rolled out a small algo change this week that improves the diversity of search results in terms of different domains returned.

Diversity in the search results is a great growing point, as it gives more opportunity for optimization to show its mettle. It's possible to have an entire first page of results completely dominated by a single website, if your optimization is that strong and your competitors are that weak. While the change was added only this week, reports are that it hasn't yet been noticed in the bulk of the results.

And it’s a bit of a throwback to the beginnings of the web, but according to a recent posting, the keywords tag has returned. According to Google News Product Manager Rudy Galfi:

The goal is simple: empower news writers to express their stories freely while helping Google News to properly understand and classify that content so that it’s discoverable by our wide audience of users.
Similar in spirit to the plain keywords metatag, the news_keywords metatag lets publishers specify a collection of terms that apply to a news article. These words don’t need to appear anywhere within the headline or body text.

One small difference in the way the news_keywords tag works is that you're limited to 10 terms, likely to stop publishers from cramming in as many terms as they like. It's a new way to use an old tag, and it's going to be interesting to see how its use plays out.
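For publishers curious what that looks like in practice, here's a minimal sketch of emitting the tag while respecting the 10-term cap; the helper function and example terms are made up, and only the tag name and the limit come from the announcement above.

```python
# Hypothetical helper: build a news_keywords meta tag, keeping only the first
# 10 terms since Google News says it will consider at most that many.
def news_keywords_tag(terms, limit=10):
    trimmed = [t.strip() for t in terms if t.strip()][:limit]
    return '<meta name="news_keywords" content="{}">'.format(", ".join(trimmed))

# Example with made-up terms for a local news article.
print(news_keywords_tag(["Winnipeg Jets", "NHL", "training camp", "hockey"]))
# -> <meta name="news_keywords" content="Winnipeg Jets, NHL, training camp, hockey">
```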

Facebook Search Engine on the Horizon?

It's long been clear that a search engine is a search engine, and there are a handful online that receive the majority of the traffic out there. Google, Bing and Yahoo are the usual places people search, with other sites like Blekko, DuckDuckGo, and Ask used by those wanting a different experience. Facebook, however, is one of the largest websites online, and with its user base approaching a billion, the world has been waiting to see whether Facebook will try to enter the search arena.

Ever since Facebook went public, the stock has been a sort of tepid pool, and with no real revenue model the online mutterings often turn to the topic of a search engine. And when Mark Zuckerberg throws around statements akin to "Facebook is doing a billion searches per day without trying," the mutterings pick up some volume. In a recent TechCrunch interview, Zuckerberg made it clear that the company realizes there is a huge opportunity for a search engine with Facebook, but tempered that by also talking about how the way users search is ever evolving.

"Search engines are evolving" to "giving you a set of answers," Zuckerberg said. "Facebook is pretty uniquely positioned to answer a lot of questions people have. At some point we'll do it," he went on. "We have a team working on search."

There are some real concerns about just how Facebook could leverage their massive user base as a search engine, however, and it has less to do with spidering capabilities than it does with privacy. The system as Zuckerberg briefly described and envisioned it would take the opinions of your friends, family and contacts and try to form a result for your query. Searching for terms like 'best burger' or 'new batman movie reviews' wouldn't necessarily be informational, but would deliver a list of opinions from your contact list. At any rate, it's not happening today or tomorrow, or even soon for that matter. But if (or when) it does, it will introduce change into the search landscape and the online experience as a whole, and change is very, very good.
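To make the idea concrete, here's a purely illustrative sketch of searching your contacts' opinions rather than the open web; nothing here reflects anything Facebook has built, and the data structure, names and ratings are invented for the example.

```python
# Purely illustrative: "search" over your contacts' opinions instead of the
# open web. All names, posts and ratings below are invented.
from collections import namedtuple

Post = namedtuple("Post", ["author", "text", "rating"])

def social_search(query, friend_posts):
    """Return friends' posts mentioning the query, highest-rated first."""
    matches = [p for p in friend_posts if query.lower() in p.text.lower()]
    return sorted(matches, key=lambda p: p.rating, reverse=True)

posts = [
    Post("Alice", "Best burger in town is at the corner diner", 5),
    Post("Bob", "That new burger place downtown was just okay", 3),
    Post("Carol", "Saw the new Batman movie, loved it", 4),
]

for post in social_search("burger", posts):
    print(post.author, "-", post.text)
```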

Google-fu and GoDaddy Gets Disrupted

A search engine seems like a rather simple tool on the surface. Whether it's Bing, Yahoo or Google as your flavour, when you arrive on their page you're greeted with an empty text box and a search button. It's not difficult to work out that with a few keystrokes you can be off and searching for whatever you fancy. But for as simple as it seems, it can be a much more powerful piece of equipment if you take the time to learn how to leverage its power in your favour.

Google has stepped in this year by offering a class on how to use its search engine to discover what you're looking for. It's a free, online course whose goal is really just to improve the user experience, both for you and for them. From one of the Google blogs: "The community-based course features six 50-minute classes along with interactive activities and the opportunity to hear from search experts and Googlers about how search works."

One of the Google representatives likened it to always driving a vehicle in first gear: it will definitely get you where you're going, but there's always a faster, better way to do it. The Google search engine offers a whole host of ways to construct a query, whether you want to do basic calculations or look up a time zone or zip code. Even if you're having trouble describing what it is you're searching for, you can drag an image directly into the search bar and conduct your search that way. Add to the couple of tips here all of the search operators you can use (quotation marks, + or -, etc.) and the possibilities begin to multiply. I guess the question is, how's your Google-fu?
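As a quick illustration of combining those operators, here's a small, hypothetical helper that assembles a query string from an exact phrase, required terms, and excluded terms; the function and example terms are made up.

```python
# Tiny illustration of combining the search operators mentioned above
# (exact-phrase quotes, + to require a term, - to exclude one) into a query.
def build_query(phrase=None, require=(), exclude=()):
    parts = []
    if phrase:
        parts.append('"{}"'.format(phrase))              # exact phrase
    parts.extend("+{}".format(t) for t in require)       # must appear
    parts.extend("-{}".format(t) for t in exclude)       # must not appear
    return " ".join(parts)

print(build_query(phrase="winnipeg jets schedule",
                  require=["2012"],
                  exclude=["tickets"]))
# -> "winnipeg jets schedule" +2012 -tickets
```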

Yesterday afternoon, for around 4-5 hours, GoDaddy suffered an issue with their DNS servers. Some reports tie it to a hacking attempt, and the result was a complete shutdown of a projected hundreds of thousands of websites. Only GoDaddy really knows how large the affected number is, but if you did lose traffic yesterday, you can rest easy about your rankings and position, at least with Google. John Mueller had this to say about the outage: "In this case (the service outage), assuming it was just the DNS that wouldn't resolve, we would treat it similar to how we treat other temporary crawling issues, and just retry at some later point."
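In the spirit of that "just retry at some later point" approach, here's a generic retry-on-DNS-failure sketch; it's not Google's crawler code, just a simple backoff pattern using the standard library.

```python
# A generic "retry later" sketch for a temporary DNS failure, in the spirit of
# the quote above; this is not Google's crawler, just a simple backoff pattern.
import socket
import time

def resolve_with_retry(hostname, attempts=3, delay_seconds=5):
    """Try to resolve a hostname, sleeping and retrying if DNS lookups fail."""
    for attempt in range(1, attempts + 1):
        try:
            return socket.gethostbyname(hostname)
        except socket.gaierror:
            if attempt == attempts:
                raise  # still failing after the last attempt; try again later
            time.sleep(delay_seconds * attempt)  # simple linear backoff

print(resolve_with_retry("example.com"))
```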

Google on Fire?

There's been an interesting tidbit of news in the last couple of days surrounding Google, although it's not about any search decisions or changes they've made.

One of the notions being put forward is that Google is wary of Amazon and its recently announced Kindle Fire. The new tablet runs on the Android operating system, which Amazon has tailored to its own use. Normally this wouldn't make a whole lot of news; there are a handful of tablets on the market of varying quality, and some use Android. The reason the Fire has made a ruckus is because of how it's been tailored.

Normally when a company builds Android to its own use, the default search engine is Google. Amazon, likely because of their business model, chose to forgo Google as their search provider; instead, when you search on the Fire you're directed to an Amazon search results page. As a user this makes some sense: you've bought an Amazon product, so if you're searching for or curious about a product, you receive an Amazon page. The reason this is a big deal for Google is that when people search to purchase or research a product, they're qualified users, and Google can serve ads on that results page. The ads that appear on product search pages are worth more to Google, so Google makes more money on them. With Fire users being directed to an Amazon results page, they're skipping that money generator entirely, a complete loss for Google.

What hasn’t been truly answered is what happens if you were to do an informational search, say for some local movie theaters or restaurant bookings. Amazon isn’t so much a local retailer as a global one, so it’ll be interesting to see what gets delivered.

Is Bing Better than Google? You Decide

Google is the most widely used search engine globally, and accounts for roughly 65% of search usage in North America. Bing is the rebrand of the Live search service, and it has been holding steady lately at around 30% of the search marketplace.

Bing has long contended that it offers a comparable search service, and some in the search world share that sentiment. But even with the rebrand, television commercials, and taking over Yahoo's search, its share remains steady at about a third of the market. Its latest effort, dubbed the Bing It On test, is a blind survey that displays unformatted, unbranded results from both engines and lets the user decide which set of results they would use. It's the same testing method as the Pepsi test, where random people were given a sample of two drinks and asked which they preferred.

When Bing tallied the results of their (very small) online sample of 1,000 people, they found that users chose the Bing results at almost a 2:1 ratio. That's a rather large statistical difference from the current norm of Google dominating the search market, so why don't the numbers match current market share? Well, for starters, the sample is incredibly small. A data sample of 1,000 people in the 18+ demographic is a drop in the ocean, with there being somewhere north of 200 million people in the US alone. If you're interested in which search engine appeals to you as a user, you can try out their survey for yourself here.

More Google Patent Thoughts

Previously I wrote about a Google patent which has gained more and more traction in the search community. The patent in question is named Ranking documents, and as mentioned previously it seems like an interesting tactic to employ to catch those who use less-than-ideal tactics to rank a website.

Unfortunately, you'll often find that when people discuss search engine optimization they tack on terms like spamming or buying links, but the truth of it is that when you do it right, SEO is none of those things. I think of the newly discovered patent like a magic trick: it's a bit of sleight of hand that Google is using to get unscrupulous tacticians to reveal themselves. Google shows them the index, and they try to spam their way to a high ranking. When Google notices the spam, it shows a different set of results (if you don't think they can do that, you're not thinking clearly) just to see the reaction of their target. If they continue to offend, their site will likely be penalized, and the legitimately ranking pages won't have been affected negatively.

Spamming links to a website to give undeserved weight to a term is a measure used by those in the industry who are in the game for only a few quick clicks. Often working with disposable website URLs, they will target a hot, trending term (Paul Ryan's RNC speech, for example) and try to garner as many links and as much visibility as possible in as short a time frame as they can. It's a pump-and-dump tactic that is very likely to get the offending site banned from the index and the URL blacklisted.

I foresee that this patent will be dissected, discussed, scrutinized and blamed for all manner of SEO problems and headaches to come. The only problem some site owners have with Google, Bing and Yahoo is that the engines are getting better and better at catching the people who try to cut corners or use less-than-natural methods to rank their site. Maybe one day soon we'll have an adaptive algorithm which detects and removes spam sites as you actively search, so black hats go the way of the dodo.

Algorithm Updated and It's Getting Smarter

There have been adjustments, changes, and what seem like complete rewrites of the algorithm Google started with in the beginning. At first, the results you were given were based directly on the query as you'd entered it, sorted by how many backlinks each result had. Now, however, when you search for 'Winnipeg Jets', images and video for the team appear even though the words "images" or "videos" weren't in the query.

The algorithm that Google, Bing and a handful of others use has grown and evolved to the point where it tries to anticipate what you're searching for, as well as answering the direct query you typed. The search engines are getting better at surfacing what you want to see on a given topic, and seem to weigh clicks through to a result alongside all of the previous criteria. If you've ever mistyped a word or two while searching, you've probably noticed that search engines can also correct commonly made spelling mistakes. What the engines are getting much better at, however, is not just correcting your spelling but interpreting what you may actually be searching for. Google can load dictionaries of how words should be spelled and common misspelled variations of those words, and can look at how searchers correct their searches and when they click on different variations. It can use this data not only to suggest a query with a different spelling, but to treat the misspelling as a synonym behind the scenes and rank the correctly spelled matches.
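A toy sketch of that misspelling-as-synonym idea follows; the tiny dictionary and query are invented, and a real system would learn these mappings from searcher behaviour rather than hard-code them.

```python
# Toy version of the "misspelling as synonym" idea: a small dictionary of known
# misspellings silently expands the query so correctly spelled pages still
# match. The entries and query are made up.
MISSPELLINGS = {
    "restarant": "restaurant",
    "calender": "calendar",
    "winipeg": "winnipeg",
}

def expand_query(query):
    """Return the query terms plus corrected variants as hidden synonyms."""
    expanded = set(query.lower().split())
    for term in list(expanded):
        if term in MISSPELLINGS:
            expanded.add(MISSPELLINGS[term])
    return expanded

print(sorted(expand_query("winipeg restarant reviews")))
# -> ['restarant', 'restaurant', 'reviews', 'winipeg', 'winnipeg']
```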

In the same basket as interpreting what you might be looking for, Google is also noticeably moving forward on trying to discern your intent while you search. A basic description of how it works:

Keyphrases don’t have to be in their original form. We do a lot of synonyms work so that we can find good pages that don’t happen to use the same words as the user typed. – Matt Cutts

Even Bing is getting in on the act: if you misspell a common word or phrase, you'll still often be brought to the correct results. Perhaps soon enough you won't need to search by typing at all, but simply by visiting your preferred search engine. Because you researched a new car and purchased one through an online dealership, the engine will know that in a few months it should perhaps deliver you information on local garages offering oil changes and tire rotations.

Happy Birthday to Fresh and New Google Patents Affecting the Index

An interesting patent has been discussed lately, which may be related to all of the recent shifts in Google's index that seem to shake up the rankings pages every few weeks.

A basic description of how the patent reads:

Rather than allow the rankings to respond immediately and directly to those changes, the system would change rankings in unexpected, counter-intuitive ways while a page moves from its first position through transition positions to the final "target rank" position.

It sounds a little strange to think that the algorithm would work in an almost backwards way, but it's an interesting idea. It's a way for Google to monitor situations where spam may be an issue in a website's position, whether through links or content. And as the last portion of the analysis describes,

significant changes in position continue to happen even though there is no change in page’s ranking factors
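To picture how such a "transition rank" might behave, here's an illustrative sketch that walks a page from its current position toward a target position through noisy intermediate steps; none of this comes from the patent text itself, and the step count and noise values are arbitrary.

```python
# Illustrative only: walk a page from its current rank toward a newly computed
# "target rank" through intermediate, deliberately noisy positions instead of
# jumping there at once. Step count and noise are arbitrary.
import random

def transition_ranks(current_rank, target_rank, steps=4, noise=3, seed=0):
    """Yield intermediate ranks that drift toward target_rank over several updates."""
    rng = random.Random(seed)
    for step in range(1, steps + 1):
        # Straight-line path toward the target...
        expected = current_rank + (target_rank - current_rank) * step / steps
        # ...perturbed so the movement looks counter-intuitive to an observer,
        # except on the final step, which lands exactly on the target.
        jitter = rng.randint(-noise, noise) if step < steps else 0
        yield max(1, round(expected + jitter))

# Prints four intermediate positions, ending at the target rank of 5.
print(list(transition_ranks(current_rank=40, target_rank=5)))
```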

Happy Birthday to Fresh

It's been five eventful years now since Fresh made its landing in Winnipeg, and we've steadily become the online leader in the city. We've seen the construction of a new museum and stadium, the return of the long-lost Jets, and a steadily growing business sector.

In terms of online change and growth, the internet has brought us nearly a billion Facebook users, the introduction of an alternative in Google+, and many changes and updates to the way people access information on the web. Online search, and businesses using their websites to capture their audience, have been steadily growing in Winnipeg. While it was a little slow in the beginning to show locals the way, we're surely and steadily getting there, with more and more businesses and people coming online each day. It's been a great beginning, Winnipeg, and we look forward to many more years to come.

Penguin Incoming?

If you've felt a little overrun in the last little while by the Google zoo that has been running through the index, we're a tad sorry to report that we're not quite out of the wild yet.

Google has mentioned that while the Penguin algorithm shift is targeted at removing or reducing spam sites in the results, it is still actively being adjusted and worked on. The Penguin update is acting as an adjustment (their word) to the results, removing the backlink value that spammy sites could pass on to those looking to make a quick buck or a shift in their positioning. As for Panda, that was the content update in case you've forgotten, and it is still tuned for digging out poor content on sites and pages. And although the updates are coming more consistently now that they're automated, the shock and surprise that website owners initially experienced from positioning drops have lessened.

Panda is a regular monthly addition to the search index now, and Penguin is being incorporated in much the same way; however, it still has a way to go. With a feature coming soon to Webmaster Tools for handling unnatural links pointing to your website, there will undoubtedly still be some site owners who experience a shift in rankings if they're inattentive with their sites. Over time, however, it will be a part of regular indexing, and the results will be cleaner for it.
Matt Cutts on the growth Penguin still needs:

If you remember, in the early days of Panda, it took several months for us to iterate on the algorithm, and the Panda impact tended to be somewhat larger (e.g. the April 2011 update incorporated new signals like sites that users block). Later on, the Panda updates had less impact over time as we stabilized the signals/algorithm and Panda moved closer to near-monthly updates. Likewise, we’re still in the early stages of Penguin where the engineers are incorporating new signals and iterating to improve the algorithm. Because of that, expect that the next few Penguin updates will take longer, incorporate additional signals, and as a result will have more noticeable impact.