Category Archives: internet marketing

Bing Gaining on Google?

Bing is currently going through a transformation of sorts: they’ve revamped their look and performance, changed the way they handle social, and tried to streamline everything overall. The end result so far, according to their own internal testing, is that they’ve come out ahead of Google, posting a near 10% gain while Google lost 10% of its score. So what has Bing been up to?

Firstly, they’ve been working hard at incorporating more of the social web into your search results. Earlier in the year, Google introduced its version of this idea as Search, plus Your World, and was met with the ire of the masses. The claim was made that Google was favoring its own social network while shunning Facebook and Twitter, with Google countering that it couldn’t gather information from those sources. Bing, meanwhile, manages to pull information for searches from all of these sources: Facebook, Twitter, and Google+. It may seem as though Google was just blowing hot air, but it should be mentioned that late last year Twitter effectively blocked the search engine, and Facebook keeps a pretty tight handle on what gets out onto the web, even with open and social profiles. Microsoft currently has deals worked out with both of those parties to index their information, and if a Google+ profile is set to public, everything on that page is indexed like any other public website.

Bing used to mix your social results in with your search results, but they’ve since gone in a completely different direction. All of the social results have been moved off to the right side of your screen, where your friends, family, and colleagues are ranked by relevancy to your search. Also included in those social results are people and items which may be relevant to your search. According to Bing, the reason for the change is that mixing social results in with the organic ones diluted the page too much and hurt your searches.

So where does that leave us? Bing is in the process of launching its completely revamped search and social service, and based on its own internal testing, it has made big gains in the search world. A blog post on that point makes it a little clearer:

We regularly test unbranded results, removing any trace of Google and Bing branding. When we did this study in January of last year, 34% of people preferred Bing, while 38% preferred Google. The same unbranded study now shows that Bing search results have a much wider lead over Google’s: when shown unbranded search results, 43% prefer Bing results while only 28% prefer Google results.

Ever Changing Web Technology

Like everything else, the way we use the internet changes almost daily. From a single browser interface to the half dozen now available depending on preference and platform, web technology has been changing and evolving almost as fast as the web itself.

Take browsers, for example. Just a few years ago, in 2008, the online world was dominated by Internet Explorer, followed by Firefox and just a sprinkling of the odd ones here and there. That was the year Google Chrome was introduced, and since that time the standings have shifted. As of the start of 2012, the browser market is fairly evenly split between the top two, Firefox and Chrome, with Internet Explorer coming in a distant third and the rest still just a smattering on the internet landscape. As of March 2012, Internet Explorer has dipped under 20% of the browser landscape; thankfully, at least half of that share is on a more current version of the browser, version 8.

But browsers aren’t the only change we’ve seen online in the last few years; social media has become a massive market on the web. The largest player in the space needs no introduction: Facebook dominates the social market, with around half a billion users logged in on an average day. The discouraging part of that number, however, is that nearly half of businesses don’t even use social media marketing to their advantage. Only about 20% of businesses are using Facebook to push their brand and market, with smaller business owners more readily embracing the technology. Knowing an avenue needs to be explored and actually taking the step to do so are two different things, and much of the time it seems people simply make it more complicated than it is. The central concern for any marketing is return on investment, and while organic search engine optimization is the best return in the business, its cost and time factors make it difficult for those with very shallow pockets. Free advertising like that found on Facebook and Twitter, however, can be easily measured: broadcast your ad or tweet, then measure your traffic over the next couple of days. It’s not magic; it’s simple math when you keep it basic.
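That back-of-the-envelope measurement really is just a few lines of arithmetic. Here is a minimal sketch; all of the numbers are made up purely for illustration:

```python
# Hypothetical traffic figures for measuring the lift from a free post or tweet.
baseline_daily_visits = 120        # average visits/day before the post went out
visits_after = [150, 180, 140]     # visits on the three days following the post

# Extra visits attributable to the post, relative to the usual baseline.
lift = sum(v - baseline_daily_visits for v in visits_after)
print(f"Extra visits attributed to the post: {lift}")

# With a paid campaign you would also divide spend by conversions.
ad_spend = 50.0       # hypothetical spend in dollars
conversions = 8       # hypothetical conversions from that spend
print(f"Cost per conversion: ${ad_spend / conversions:.2f}")
```

The point isn’t precision; it’s that even a rough baseline comparison tells you whether the broadcast moved the needle at all.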

The Goliath Complex of SEO

The goal of SEO is relatively simple: to make your site rank as highly as possible within the search pages for your niche. Whether you build houses, write stories, or draw pictures, search engine optimization is applicable to any website online. What a lot of smaller business owners can also use SEO for is to knock the big players down a peg or two.

It’s important for all parties to consider SEO a great equalizer online; you do, however, have to remember to stay within the rules. There are billions of web pages online, and yet even with that daunting number in mind, it’s still a relatively simple process to stay within the sights of the search engines. All you really need to keep in mind are the basics; even just following the best-practices guidelines gives your website a shot at being picked up and indexed. But you also need to remember that the internet isn’t exactly a friendly place yet; a great deal of the web is free and wild. As a small example, you can’t control which websites choose to link to you. This can be a difficult hurdle to overcome, as irrelevant or inappropriate backlinks leading to your website can seriously hamper any SEO efforts you have in place. And that is only a single element of what’s known as negative SEO.

The larger, more established, and authoritative sites such as Amazon are somewhat safer in this regard; however, no one is completely immune to negative SEO. Negative search engine optimization covers spammy links, blatant keyword stuffing, duplicate content, or anything else that isn’t considered white-hat SEO by the search engines. Smaller, newer sites are unfortunately more susceptible to negative optimization problems. In the beginnings of a site’s growth, it may not have much content or many links pointing to it, and if you’re not careful with how you craft your content or structure your links and navigation, you may even get dinged for having duplicate or irrelevant content in your niche. The number one point to keep at the forefront of your mind, though, is that because the internet is still wildly untamed, the playing field is actually relatively plain and simple: follow the rules, manage your website, and monitor your content to make sure it doesn’t get scraped and that it hasn’t been copied from another resource. Even the big hitters can be taken down online; no target is too big or too relevant on the web.

Google’s Penguin Attacks

So the large update Google pushed out late last week, which now has a name you can curse, Penguin, has had its share of folks caught in the crossfire and down-ranked. In case you were wondering what the update was about, the short version is that it was targeted directly at reducing webspam and sites which use “aggressive spam” tactics.

As always, Matt Cutts came out on his white horse, maintaining that as long as you create quality, original content and stick to the best practices, you should be alright with this new update. What was discovered over the weekend, however, and something site owners couldn’t entirely prepare for, was the side effect of targeting spammy sites: while as a site owner and web admin you can control what content is contained within your site, you unfortunately have very little control over who, or what, links to it.

Larger online brands have felt little change from the update so far, but that doesn’t help any of the smaller sites out on the web. While Google stated that only 3% of search results would be affected, it seems as the week gets underway that the number will be a tad higher. The notable sites cropping up in discussions tend to be smaller e-stores using shared or affiliate information. In an affiliate layout, already not one of the search engines’ favourites, if any one site in the chain adopts bad practices, the down-ranking factors will eventually reach your site as well.

Amid all of the uproar over sites being dropped in the rankings, or in some cases lost from the index completely, there have been some valid suggestions. One of the most basic, and most helpful, is that instead of Google punishing everyone linked in a bad chain, it could simply remove any ranking or relevancy from the original, infringing domain. That way, not every site down the line gets kicked, and site owners won’t immediately go into panic mode.

Duplicate Content in New Website Creation

When you’ve decided to build yourself a new site, whether due to needing an update or just looking for a new image, there’s a very important step to monitor. Before you get too far into the process, you need to ensure you’re not making the rookie mistake of allowing the search engines to index both versions of your website. Doing so can cause you grief and could ultimately see both websites penalized for duplicate content.

When you’ve begun working on the newest version of your site, you need to ensure it isn’t being indexed by the search engines, so you can work all you like without worry. The simplest way is to use your htaccess file to block the bots; alternatively, if you have the means, you can work on a local server where the site isn’t technically on the internet. Duplicate content can leave Google or Bing unsure which page to list in response to a search: the engines suddenly have two versions of your website and content to consider, and need to determine which they feel is the more relevant of the two. Seeing as your old site had the content first, you stand to injure your brand’s reputation and your new URL simply by working on a new site or look.
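As a rough illustration of the htaccess approach, something along these lines is one common way to keep crawlers out of a development copy. This assumes an Apache server with mod_headers available, and the `.htpasswd` path is a placeholder, not a real location:

```apacheconf
# Ask all crawlers not to index anything served from this dev directory.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>

# Stronger option: require a password so bots can't reach the pages at all.
# (AuthUserFile below is a placeholder path; point it at your own .htpasswd.)
AuthType Basic
AuthName "Development site"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

The password option is the more reliable of the two, since well-behaved crawlers honour noindex headers but scrapers may not.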

Duplicate content isn’t just a concern when you’re working on your own website; it’s actually a point you should make note to occasionally monitor. A bothersome trait, and a difficult problem to tackle, is when your own original content ends up being scraped by a bot and winds up on an aggregator site. You can hunt for your own content by searching for key phrases and terms you’ve used within the content and/or title; hopefully, the only sites which come up are your own, or those you’ve given permission to reproduce it. Typically, scraper sites don’t rank that highly in search anymore, but there are still occasions where they show up higher in the results than the original creators. When that happens, you often become trapped in a terrible cycle of trying to have your own hard-earned content removed from the index, and having credit given where credit is due.
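As a rough sketch of that kind of check, Python’s standard difflib can score how closely a suspect passage matches your original. The texts and the 0.8 threshold below are arbitrary example values, not a recommendation:

```python
import difflib

# Compare a passage of your original content against text found on another
# site, to gauge how much of it was copied more or less verbatim.
original = "Duplicate content can cause search engines to pick the wrong page."
suspect = "Duplicate content can cause search engines to pick the wrong page entirely."

# ratio() returns a similarity score between 0.0 (nothing shared) and 1.0 (identical).
ratio = difflib.SequenceMatcher(None, original, suspect).ratio()
print(f"Similarity: {ratio:.0%}")

if ratio > 0.8:  # the threshold here is an arbitrary example value
    print("Likely scraped or duplicated")
```

For a whole site you would run this across exported passages rather than single sentences, but the principle is the same: near-identical text elsewhere is worth investigating.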

Youtube Wins and Loses in Germany

Today, overseas in Germany, Google both won and lost a court case involving Youtube. How can that happen? Well, it’s an interesting case, and one which, if the verdict is upheld, will be used as a marker for future dealings with the platform.

Google has long contended that Youtube is simply a host, not a creator, of the content you can find on it. Because anyone can create an account and upload anything they want, Youtube is by definition a host for content; there are some very basic editing tools on the site, but you can’t record anything on, or through, their site or software. Today, a court in Germany ruled that Google needs to install filters within Youtube in order to detect and stop people from gaining access to materials to which they do not own the rights. The judge also said that Google is not responsible for the uploaded material itself, but merely needs to do more to help stop copyright violations. That is how you can win and lose at court: Google and Youtube were legally absolved of responsibility for the content on the service, and were instead charged with helping to clean it up.

Also in the list of small victories, beyond being told Youtube isn’t responsible for the content being uploaded: they were saved from having to sort through the entire catalogue and purge anything with a copyright tied to it. With billions of hours of video, that would be an impossible undertaking at best guess. Just because the case has been decided doesn’t mean Youtube and Google are taking it lying down; they still intend to appeal the decision, as any loss can be viewed as a loss. GEMA, the German organization which controls royalty payments for material it holds rights to, is the party which took Youtube to court, over 12 songs uploaded in 2010. Google has said it will negotiate with the organization so that the artists whose work is copyrighted receive their due.

Blekko’s Monstrous Growth

While most players in the search industry fluctuate within a few points, Blekko has enjoyed a huge increase in traffic over the last few months. Since the beginning of the year, Blekko has seen a gain of more than 350% in traffic, and expects to reach 400% by the end of the month. These are all unique IPs accessing the site to conduct searches, likely including SEOs taking advantage of the tools Blekko has available.

Blekko was already enjoying slow and steady growth in 2011, averaging just over 1.5 million uniques in the month of December. But flipping the calendar page to 2012 seemed to herald a new beginning for the slash-tag search engine: uniques for the first month of the new year doubled what was seen in December and broke the 3 million mark. And while the initial information came from Blekko itself, casting a bit of a shadow on it, it has since been learned that, while the exact numbers aren’t known, the growth is real.

There are always shifts on the web; new sites grow and old sites decline in traffic, but sudden, massive growth like Blekko is experiencing should still be taken with a grain of salt. Those in charge of the company offered a few reasons why they feel they’ve seen such explosive growth in the new year, and probably the largest is that they’ve taken the time to make their presence known. The company has made a point of attending major conferences to tout its strengths, so it shouldn’t be too much of a surprise that it’s experiencing higher growth than before. They also listed their recent upgrades as a reason for the sustained growth, which delivered an improved index for the people who use the engine and build their slash tags. On the technical side of the equation, with the loss of Yahoo Site Explorer and the new tools Blekko offers, they’ve undoubtedly seen an increase in traffic to that area of their site as well.

Competition in the search space is a great thing, and Google has said previously that it welcomes it. It encourages change, growth, and an ever-expanding choice in what the public can use.

Basic Website Knowledge 101

There are a few basic rules and ideas you should always keep in mind when working on the web. Sometimes, no matter how often you’ve done the same steps before, you make a mistake. Depending on the severity, you can take down a website, mess up a web page, or make minor little code mistakes which break your page layout in the odd browser.

One of the most basic points to keep in mind while working on your website is to keep it simple. A less repeated, but just as important, lesson is to always back up your work. No matter how basic or simple your steps may be, you should always keep a backup before you push your changes live. Failing to keep a backup of your original site or content before getting to work is a simple mistake, and one which can cost you more work if you’re not careful. Even seasoned coders make mistakes, and when they happen, a blog, for example *cough*, can be offline until a backup is restored.
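A minimal backup step can be as simple as copying the site tree aside before touching it. This sketch uses Python’s standard library; the directory names are placeholders for whatever your own layout looks like:

```python
import datetime
import pathlib
import shutil

def backup_site(site_dir: str, backup_root: str) -> str:
    """Copy the whole site tree into a timestamped folder before editing it."""
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = pathlib.Path(backup_root) / f"site-{stamp}"
    shutil.copytree(site_dir, dest)  # copies every file and subdirectory
    return str(dest)
```

Restoring after a botched change is then just copying the snapshot back, which beats rebuilding a page from memory.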

But enough about completely crashing a website or losing content and materials; there are smaller errors you can make which can hamper your site without being immediately obvious. If you’ve been rewriting your basic tags, say your title, description, and keywords (yes, I know, the internet says they don’t really matter anymore), and you happen to mix them up with the wrong content, you could see a negative impact on your rankings. And even the loss of a single position in the search results can equate to lost conversions. Another common error, one which doesn’t directly impact your rankings and website performance and is a tad more difficult to detect, is mis-tagging elements on your pages. It may seem a small and innocuous step to miss in a website or page, but every little thing adds up. And when it comes to optimization and your online competition, every little bit helps.
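One low-tech way to catch mixed-up tags is to pull the title and meta description out of a page and eyeball them before going live. A minimal sketch using Python’s built-in HTML parser; the page snippet is a made-up example:

```python
from html.parser import HTMLParser

class HeadTags(HTMLParser):
    """Collect the <title> text and the meta description from a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name") == "description":
                self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

parser = HeadTags()
parser.feed('<head><title>Blue Widgets | Acme</title>'
            '<meta name="description" content="Hand-made blue widgets."></head>')
print(parser.title)
print(parser.description)
```

Run across every template before a push, a check like this makes a swapped title and description stand out immediately.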

Google Saving Your Advertising Dollars

It’s somewhat common knowledge that when someone performs a search, a box of “sponsored results” will appear beside, above, and sometimes even below the organic results. Bing has a paid service, as does Yahoo, and Google has AdWords, which proved a business in search can be profitable. There has been discussion lately surrounding paid search advertising and the big three search engines, and if you’re not careful with how you read it, you may walk away with the wrong idea.

Compared to this time last year, Google’s CPC has fallen for the second quarter in a row, while Bing’s and Yahoo’s CPCs have continued to climb. On the surface, that statement can make it sound like Bing and Yahoo have been grabbing ad space from Google. Closer to the truth, however, is that Google has become an even better value to advertise with compared to Bing and Yahoo. Search engine marketing via the AdWords platform, or one like it, has to be measured differently than the organic results; you can’t take positioning as the end goal.

When you begin to break down the numbers involved in SEM and SEO, there are some key differences you need to understand. Both depend on conversion rates, because without converting your traffic you’re wasting time and money. One of the largest and most important differences, however, is the click-through rate of your positioning: you could be ranked at the very top of the AdWords results, but if you have a poorly written ad or a poorly built website, chances are your conversions will be limited.

Another major point to keep in mind is cost per click, or CPC, as discussed earlier. Where paid advertising is concerned, CPC is a literal measure of how much it costs you each time someone clicks on your listing. Organic SEO is more difficult to pin down, as you’re not paying each time someone clicks your organic listing, but after a few months you can break the cost down more easily. A high cost per click for your search term can mean there are many people competing in the same space, or it can mean one of your competitors is driving up the bid on the keyword to try to gain dominance. By the same token, a declining average cost per click isn’t necessarily a bad omen: it can point to reduced competition, or to an improved conversion rate.
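To make the paid-versus-organic comparison concrete, here is the cost-per-conversion arithmetic with entirely hypothetical campaign numbers:

```python
# Hypothetical figures for one month of a paid campaign versus organic SEO.
paid_clicks = 400
cpc = 1.25                  # average cost per click on the paid listing
paid_conversions = 20       # sales attributed to those paid clicks

seo_monthly_cost = 600.0    # what the SEO work costs, spread per month
organic_conversions = 36    # sales attributed to organic traffic that month

# Paid cost per conversion: total ad spend divided by conversions.
paid_cost_per_conversion = (paid_clicks * cpc) / paid_conversions
# Organic cost per conversion: monthly SEO spend divided by conversions.
organic_cost_per_conversion = seo_monthly_cost / organic_conversions

print(f"Paid: ${paid_cost_per_conversion:.2f} per conversion")
print(f"Organic: ${organic_cost_per_conversion:.2f} per conversion")
```

With these made-up numbers, organic comes out cheaper per conversion even though its monthly bill is fixed; the relative answer will obviously differ for every campaign, which is exactly why raw CPC alone tells you so little.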

Google: Always Maintain Your Website

Occasionally, if you’re diligent about handling your website and checking in on Webmaster Tools, you’ll get the odd warning. Most of the time they’re not major; maybe your sitemap is old, your navigation has an error, or you have an erroneous line in your robots.txt file blocking crawling on your site. But for people who’ve been a little naughty with acquiring backlinks, perhaps getting caught buying links to inflate their PageRank, the notice in Webmaster Tools is only the first step of the work that needs to be done.

There is some great information available, direct from Google itself, about how to handle being called out for unnatural links pointing back at your site. When you get a warning like this, you’ll also get notice that the penalty will be attached for six months, but that doesn’t mean you need to grit your teeth and bear it for that long. If you’ve acted quickly, cleaned up all of the errors that were reported, and are serious about your online positioning, you should submit a reconsideration request as soon as possible. Sitting and waiting out the penalty doesn’t just affect your site in the short term; it will also affect your positioning, and possibly your reputation, in the long run.

If you’ve been flagged as having unnatural links pointing at your site, you need to go as far back as the links go. If you’ve been building them for a year, clean up the last year’s links; if it’s two, five, or even ten years of links, it all needs to be dealt with. That means a massive undertaking, but this is your online presence, and possibly the survival of your online business. The time taken to clean up all of the links leading into your site is time invested in the well-being of your company.
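When facing years of accumulated links, a sensible first pass is to group a backlink export by linking domain, so the heaviest (and often spammiest) sources surface first. A minimal sketch with made-up URLs standing in for a real export:

```python
from collections import Counter
from urllib.parse import urlparse

# Placeholder backlink list; in practice this comes from a Webmaster Tools
# or third-party backlink export.
backlinks = [
    "http://spammy-directory.example/page1",
    "http://spammy-directory.example/page2",
    "http://spammy-directory.example/page3",
    "http://partner-blog.example/post",
]

# Count links per linking domain, most prolific first.
domains = Counter(urlparse(link).netloc for link in backlinks)
for domain, count in domains.most_common():
    print(domain, count)
```

Working down that list domain by domain turns an overwhelming cleanup into a prioritized queue: tackle the domains contributing the most questionable links first.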

The last piece of pertinent advice is just as important: just because you received a notice today doesn’t mean the problem has only now been noticed. Google has only recently begun actively sending out reports to site owners, so receiving notice that your pages aren’t crawling properly doesn’t mean they never noticed before. The issue you’ve just become aware of may be the very reason you’ve never been able to hit page one, or overtake a competitor in your online market. Acting on your report and quickly submitting for reconsideration isn’t only the best course of action; it should be viewed as the only course of action after receiving a notice.