Tag Archives: search

Is Bing Better than Google? You Decide

Google is the most widely used search engine globally, and accounts for roughly 65% of search usage in North America. Bing is the rebrand of the Live Search service, and it has been holding steady lately at around 30% of the search marketplace.

Bing has long contended that it offers a comparable search service, and some in the search world share that sentiment. But even with the rebrand, the television commercials, and the takeover of Yahoo's search market, its share remains steady at roughly a third of the market. Bing's latest effort, dubbed the Bing It On test, is a blind survey that displays unformatted, unbranded results from both engines and lets the user decide which set of results they would actually use. It's the same testing method as the old Pepsi Challenge, where random people were given samples of two drinks and asked which they preferred.

When Bing tallied the results of its (very small) online sample of 1,000 people, it found that users chose the Bing results at almost a 2:1 ratio. That's a large statistical gap from the current norm of Google dominating search market share, so why don't the market-share numbers look the same? For starters, the sample is tiny. A sample of 1,000 people in the 18+ demographic is a drop in the ocean when there are somewhere north of 200 million adults in the US alone. If you're interested in which search engine appeals to you as a user, you can try the survey for yourself here.

Looming Holidays and Online Budgeting

It's crunch time for budgets. We're coming to the point where you need to decide: do we spend more and hope for big returns over the holidays, or do we cut back and hope we can carry through to the new year? It's not a question anyone can really answer for you, as everyone's situation is unique. I would, however, like to make a case for our niche: online branding, or internet marketing if you prefer.

Online, consistency matters most: in your content, in how you present yourself, and in how you manage your image and branding. And while it's true for almost all facets of life, it's even more apparent where your online spending is concerned: you get what you pay for. If you pay for shoddy service and workmanship, that's exactly what you'll receive. Where the search engines are concerned, though, you will likely end up penalized, and at worst banned from the results pages until you clean up your site and links. Some people try to sell organic optimization as a one-time job with a very low cost-to-work ratio, when in fact pretty much the opposite is true.

As an example, if you try to save money on your online marketing and outsource to one of the cheap advertisers out there, odds are very high that you'll end up with poor, if not irrelevant, backlinks, and that will get your site snagged by the newest algorithm addition: Penguin. The addition of Penguin to the algorithm means the link spammers of the past are quickly finding themselves with burned URLs, websites which have become useless to link to or from. It takes a great deal of time to work out a relevant, acceptable linking strategy for any company's website, and skimping on this portion of your marketing will be one of the nails in your website's coffin.

Oftentimes when you're having your website built or redesigned, you'll find that web development costs can run into the hundreds or thousands of dollars. If you have a complex site with a shopping cart, streaming video and audio, and a user login system, you could even be looking at tens of thousands. Thinking about saving money on your website and its construction? Rethink that cost-saving measure: cutting corners on the way your site functions, looks, and performs can not only get you flagged by the search engines as having errors, it can leave your site poorly secured, and the possibility of being hijacked increases greatly. And if your website is hijacked by a malicious user who uses it to send spam (the best-case scenario), or trashes it entirely and uses it to spread malware (a blacklisting from the SERPs), not only will the search engines stop trusting you, your end users and customers will begin to distrust you as well.

These are only two of the plethora of issues that can sink your website should you choose to skimp on your online budget, and while being removed from the SERPs is a terrible possibility, the interaction lost with your current and prospective customers should be seen as the real loss. Lost traffic due to reduced resources leads to lower conversion rates overall, a never-ending cycle of less and less.

Is Linking Dead?

With the way things have been progressing online, it's not much of a stretch to think that some of the old ways have fallen by the wayside for search engines. Google, Bing, Yahoo, and all the others need some kind of metric that allows them to determine what is relevant to certain topics and categories. The longest-running, and highest-contributing, factor since the beginning has been links between websites: both those relevant to your business and those which may help your positioning. It's a very simple formula, really: site A compiles a great deal of information about cogs and becomes known across the country as the top producer and information source for them. Site B is a reseller of site A, and as such provides a link directly to site A, helping cement its position online as the top purveyor in the cog industry. That's a very basic example; multiply it a trillion times and you're beginning to see the shape of the internet, and how linking works to sort out the web.
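
To make that picture a bit more concrete, here is a toy sketch (my own illustration, not anything published by Google) of how links can be turned into an authority score in the spirit of PageRank: each page repeatedly hands a share of its score to the pages it links to, and the well-linked manufacturer floats to the top. The three sites and the damping value are made up for the example.

```python
# Toy link graph: which pages link out to which (hypothetical sites).
links = {
    "site_b": ["site_a"],   # reseller links to the cog manufacturer
    "site_c": ["site_a"],   # another reseller links to the manufacturer
    "site_a": ["site_b"],   # manufacturer links to one distributor
}

# Collect every page mentioned anywhere in the graph.
pages = set(links)
for targets in links.values():
    pages.update(targets)

damping = 0.85                                 # conventional damping factor
rank = {p: 1.0 / len(pages) for p in pages}    # start everyone off equal

# A few rounds of passing score along the links (power iteration).
for _ in range(20):
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, targets in links.items():
        share = damping * rank[page] / len(targets)
        for target in targets:
            new_rank[target] += share
    rank = new_rank

# site_a, with two inbound links, ends up with the highest score.
for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```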

Over the last couple of years especially, though, the social web has made a big splash. Facebook, Twitter, LinkedIn, Pinterest, Google+: all social sites that are being crawled by the spiders and are more frequently being plugged into your search results. Some companies like to argue that the larger the web has become, the more community-focused people have become. Instead of searching for the best cog company, people are asking their friends on Facebook and Twitter, for example. A shock ran through the SEO industry at the thought that the value of links was going to be tossed away, all because people suddenly had the ability (not that they didn't already) to ask their friends for an opinion. It's a (non)issue that continues to be blown out of proportion by the unseasoned search experts out there.

But a very simple truth is this: the weight that linking and backlinking carries isn't going to go away. Not just yet, and not in the foreseeable future at any rate. That's not to say social signals and social linking won't become the heavyweights at some point, but that point is not today, not next month, and not next year.

Getting Re-Indexed and Search Dominance

We've been over the steps you need to take when you've been penalized and dropped from the index, but once you've followed all of those steps, you might be wondering what's next. To recap quickly: first go over the notification email (which you almost certainly have) and address its major points of issue. If it's bad backlinks, do your best to have them removed. Spammy content? Get a handle on it and rewrite it. Found out your SEO is playing the black-hat game of gaming the engines instead of working with them? It's time to drop them and call in the real experts in search. After all of those steps, you resubmit your site for inclusion.

But once you've done all of that, it's in the hands of the search gods. You need to sit on your hands and wait for them to decide whether you've done enough to be reindexed and included back in the search rankings. What some people don't realize, though, is that sometimes the search engines don't fully clean your record; it may only be a partial pardon, an incentive, really, to clean up the rest of your act. Just as search engine optimization isn't a black-and-white industry, neither is the business of earning traffic from Google or Bing.

So, just how relevant is too relevant? It's a question being asked lately as, more and more often, the results page is overtaken by the same website. Matt Cutts and the Google team put out a short video trying to describe just what's going on.

The method of displaying these newer results, however, has been getting under users' skin. How diverse do the search results really look when the top three or four spots, and sometimes the entire page, are taken up by a single site? Relevance to the search query is obviously what drives Google and the other search engines to deliver their results, and the better refined those results are, the better it is for the end user. Have you had any instances recently where the search results page has been dominated by a single result?

SEO Terms and You

Since we covered the very basics of how web developers, designers, business owners, and SEOs can work together a little better yesterday, let's get into a bit more detail. Taking it a little slower, we'll discuss a handful of the terms you're going to run into when working with a search engine optimization firm.

Once we've had the chance to take a good hard look at your website, one of the first things you'll find us talking about is conducting keyword research. All this means to you as a website owner is that we need to know which terms you're interested in ranking for, and we'll break down your content to see whether those keywords exist in a workable combination. It's also the step where we look up your current listings and break down how you stack up against your competitors. It's a simple step, but one that unfortunately gets abused when people believe that spamming their keyword as many times as possible is a good thing. Also tied to your website and its current performance is PageRank. It's not as huge a metric as it once was, but it's a ranking system created by Google's Larry Page that gives your site a score based on a number of factors, including the authority of incoming links and the quality of your content and website, and that rank is passed along throughout your site. It used to seem that the higher your PageRank, the higher you sat in the SERPs, but Google hasn't been as diligent about keeping the system up to date, with Panda and Penguin being introduced in the last couple of years.
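
As a rough illustration of the very first pass of that keyword research, here is a small sketch that simply counts how often each target phrase actually appears in the page copy. The page text and keyword list are invented for the example; real research obviously goes much deeper than raw counts.

```python
import re

# Invented page copy and target keywords, purely for illustration.
page_text = """Acme Cogs manufactures precision cogs and gears.
Our cogs ship across North America with a five-year warranty."""

keywords = ["cogs", "gears", "precision cogs", "sprockets"]

text = page_text.lower()
counts = {kw: len(re.findall(re.escape(kw.lower()), text)) for kw in keywords}

# Report which target phrases the copy actually supports.
for kw, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{kw!r}: {n} occurrence(s)")
```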

Once we've determined what you want to rank for, how you currently stack up in your niche market, and where to focus our efforts, you're going to start hearing terms like geo-targeting and click-through rate a whole lot. Geo-targeting is the process of constructing your website and its pages to be specifically relevant to certain areas. You can easily work city-level geo-targeting into your site with adjustments to content, and you can even drill down to neighborhoods if you begin to use tools like AdWords. By targeting your website you ensure that you're working to capture your target market and increase your overall click-through rate. Click-through rate, loosely defined, is the percentage of searchers who click on your link after performing a search. It's a great metric to keep track of, as it can fairly quickly show you whether a new campaign or advertising strategy has had a positive or negative effect on your brand and business.
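
Put as plain arithmetic, the calculation looks like this; the impression and click counts below are made-up numbers, just to show how the percentage works out.

```python
# Click-through rate: clicks divided by impressions, as a percentage.
impressions = 1200   # times your listing appeared on a results page
clicks = 90          # times a searcher actually clicked through

ctr = clicks / impressions * 100
print(f"CTR: {ctr:.1f}%")   # prints "CTR: 7.5%"
```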

The Goliath Complex of SEO

The goal of SEO is relatively simple: to make your site rank as highly as possible within the search pages for your niche. Whether you build houses, write stories, or draw pictures, search engine optimization is applicable to any website online. What a lot of smaller business owners can also use SEO for is knocking the big players down a peg or two.

It's important for all parties to consider SEO a great equalizer online; you do, however, have to remember to stay within the rules. There are billions of web pages online, and yet even with that daunting number in mind it's still a relatively simple process to stay within the sights of the search engines. All you really need to keep in mind are the basics; even just following the best-practices guidelines gives your website a shot at being picked up and indexed. But you also need to remember that the internet isn't exactly a friendly place yet; a great deal of the web is free and wild. As a small example, you can't control which websites choose to link to you. This can be a difficult hurdle to overcome, as irrelevant or inappropriate backlinks leading to your website can seriously hamper any SEO efforts you have in place. And that's only a single element of what's known as negative SEO.

Larger, more established, authoritative sites such as Amazon are somewhat safer in this regard, but no one is completely immune to negative SEO. Negative search engine optimization can be defined as spammy links, blatant keyword stuffing, duplicate content, or anything else that isn't considered white-hat SEO by the search engines. Smaller, newer sites are unfortunately more susceptible to negative optimization problems. At the start of a site's growth it may not have much content or many links pointing to it, and if you're not careful with how you craft your content or structure your links and navigation, you may even get dinged for having duplicate or irrelevant content in your niche. The number one point to keep at the forefront of your mind, though, is that even though the internet is still wildly untamed, the playing field is actually relatively plain and simple. Follow the rules, manage your website, and monitor your content to make sure it doesn't get scraped and hasn't been copied from another resource. Even the big hitters can be taken down online; no target is too big or too relevant on the web.

Google’s Penguin Attacks

So the large update that Google pushed out late last week, which now has a name you can curse, Penguin, has had its share of folks caught in the crossfire and down-ranked. In case you were wondering what the update was about, the short version is that it was targeted directly at reducing webspam and sites which use "aggressive spam" tactics.

As always, Matt Cutts came out on his white horse maintaining that so long as you create quality, original content and stick to best practices, you should be alright with this new update. What was discovered over the weekend, however, and something site owners couldn't entirely prepare for, was the knock-on effect of targeting spammy sites. While as a site owner and web admin you can control what content is on your site, you unfortunately have very little control over who, or what, links to it.

Larger online brands have felt little change from the update so far, but that doesn't help the smaller sites out on the web. While Google mentioned that only about 3% of search results would be affected, as the week gets underway it seems that number will be a tad higher. The notable sites cropping up in discussions tend to be smaller e-stores using shared or affiliate information. In an affiliate layout, already not one of the search engines' favourites, if any one site in the chain adopts bad practices, the down-ranking factors will eventually reach your site as well.

Amid all the uproar over sites dropping in the rankings, or in some cases disappearing completely, there have been some valid suggestions. One of the most basic, and likely most helpful, would be that instead of Google punishing everyone linked in a bad chain, it simply remove any ranking or relevancy from the original, infringing domain. That way not every site down the line gets kicked, and site owners won't immediately go into panic mode.

Duplicate Content in New Website Creation

When you've decided to build yourself a new site, whether because it needs an update or because you're just looking for a new image, there's a very important step to watch. Before you get too far into the process, you need to ensure you're not making the rookie mistake of allowing the search engines to index both versions of your website. Doing so can cause you grief and could ultimately see both sites penalized for duplicate content.

When you've begun working on the newest version of your site, you need to ensure it isn't being indexed by the search engines so you can work all you like without worry. The simplest way is to use your .htaccess file to block the bots, or alternatively, if you have the means, to work on a local server where the site isn't technically on the internet. Duplicate content can leave Google or Bing unsure which page to list in response to a search: the engines suddenly have two versions of your website and content to consider, and need to determine which they feel is the more relevant of the two. Seeing as your old site had the content first, you stand to injure your brand's reputation and your new URL simply by working on a new site or look.
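
As a rough sketch of what that .htaccess approach can look like on an Apache server (the domain and file path below are placeholders, and your host's setup may differ), you can either send a noindex header for everything on the staging copy, or lock the staging copy behind a password so crawlers never see the unfinished pages at all:

```
# Hypothetical .htaccess for a staging copy of the site (e.g. dev.example.com).

# Ask crawlers not to index anything served from this directory.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>

# Or go further and password-protect the staging site entirely.
# (The .htpasswd path is a placeholder; point it at your own file.)
AuthType Basic
AuthName "Staging site"
AuthUserFile /full/path/to/.htpasswd
Require valid-user
```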

Duplicate content isn't just a concern when you're working on your own website; it's something you should make a point of monitoring occasionally. A bothersome and difficult problem to tackle is when your own original content gets scraped by a bot and winds up on an aggregator site. You can check for this by searching for key phrases and terms you've used within the content and/or title; hopefully the only sites that come up are your own or those you've given permission to reproduce it. Typically scraper sites don't rank that highly in search anymore, but there are still occasions where they show up higher in the results than the original creators. When that happens, you can become trapped in a terrible cycle of trying to have the copied versions of your hard-earned content removed from the index and credit given where credit is due.
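
If you want to spot-check a specific page you suspect of scraping you, a minimal sketch like the following works; the phrase and URL are hypothetical placeholders, and it assumes the third-party requests library is installed.

```python
import requests

# A distinctive sentence from your own article (hypothetical example).
phrase = "our hand-machined cogs are stress-tested well beyond spec"
suspect_url = "https://example.com/aggregated-posts"   # page to check

# Fetch the suspect page and look for your phrase in its text.
response = requests.get(suspect_url, timeout=10)
if phrase.lower() in response.text.lower():
    print("Phrase found -- this page may have scraped your content.")
else:
    print("Phrase not found on this page.")
```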

Blekko's Monstrous Growth

While most in the search industry fluctuate within a few points, over the last few months Blekko has enjoyed a huge increase in traffic. Since the beginning of the year, Blekko has seen a gain of 350% or more in traffic, and expects to reach 400% by the end of the month. These are all unique IPs accessing the site to conduct searches, and likely SEOs taking advantage of the tools it makes available.

Blekko was already enjoying slow and steady growth in 2011, averaging just over 1.5 million uniques in the month of December. But flipping the calendar page to 2012 seemed to herald a new beginning for the slashtag search engine. Uniques for the first month of the new year doubled what had been seen in December and broke the 3 million mark. And while the initial information came from Blekko itself, casting a bit of a shadow on it, it has since been confirmed that while the exact numbers aren't known, the growth is real.

There are always shifts on the web: new sites grow, old sites decline in traffic. But sudden, massive growth like Blekko is experiencing should also be taken with a grain of salt. Those in charge of the company offered a few reasons why they feel they've seen such explosive growth in the new year, and probably the largest is that they've taken the time to make their presence known. The company has made a point of attending major conferences to tout its strengths, so it shouldn't be too much of a surprise to see higher growth than before. They also listed their recent upgrades as a reason for the sustained growth, which helped deliver an improved index for people who use the engine and build their slashtags. On the technical side, with the loss of Yahoo Site Explorer and the new tools Blekko offers, they've undoubtedly seen an increase in traffic to that area of the site as well.

Competition in the search space is a great thing, and Google has said previously that it welcomes it. It encourages change, growth, and an ever-expanding choice in what the public can use.

Search – Keep Private or No?

There's a new type of search engine making its debut on the web, dubbed Trapit. It's unique in its own right because of the premise it's built on: by learning what you search for, it delivers similar results for you to look through.

It's not an unheard-of idea, or even a truly unique one; Trapit, however, takes it a step further and tries to make educated guesses about your preferences. It's the same kind of algorithm that Apple's new Siri technology uses to deliver answers as you ask for them. Trapit specifically typecasts itself as a discovery engine, not a search engine, but that doesn't rule out what it sees as an upcoming competition with Google. Trapit co-founder Gary Griffiths called Google an online yellow pages, saying it works well for direct queries but not for discovering new content.

It's an interesting idea and a different perspective on delivering search results, to be sure. But it's rather curious that general users are, so far, okay with the way Trapit works. The puzzlement comes from remembering that the public enjoys having its privacy protected, as it should, and that more than one concern or complaint has been registered in Google's realm about privacy and about how your search terms are saved and indexed as part of your search history. My question to the early adopters and testers of Trapit would be this: how do you expect Trapit learns what you may enjoy? It saves your searches, either in a cookie on your computer or within its members database, and extrapolates from there via its algorithm.

But then again, it seems it's alright for a little player to have access to your searches and (potentially) your information, but not for the big guys who are frequently held accountable. Perhaps it's just another case of wanting to eat your cake and have it too.