Since we covered the very basics of how web developers, designers, business owners and SEOs could work together a little better yesterday, let's get into a tad more detail. Taking it a little slower, we'll just discuss a handful of the terms you're going to run into when working with a search engine optimization firm.
Once we've had the chance to take a good hard look at your website, one of the first things you'll find us talking about is conducting keyword research. All this means to you as a website owner is that we need to know which terms you're interested in ranking for, and we'll break down your content to see if those keywords exist in a workable combination. It's also a step taken when we search for your current listings and break down how you stack up against your competitors. It's a simple step, though one that unfortunately gets abused when some people believe that spamming their keyword as many times as possible is a good thing.

Also tying into your website and its current performance is PageRank. It's not as huge a metric as it once was, but it's a ranking system created by Google's Larry Page which gives each page a score based largely on the quantity and authority of the links pointing to it, and that score is passed on throughout your site. It used to seem that the higher your PageRank, the higher you sat in the SERPs, but Google has leaned on it less and less, with Panda and Penguin being introduced in the last couple of years.
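For the curious, the core idea behind PageRank can be sketched in a few lines of Python. The three-page link graph and the standard 0.85 damping factor here are purely illustrative; the real algorithm runs over the whole web and mixes in many other signals.

```python
# Minimal PageRank sketch: a page's score comes from the scores of the
# pages linking to it, split across each linker's outgoing links.
DAMPING = 0.85  # damping factor from the original PageRank paper

links = {                     # page -> pages it links out to (invented graph)
    "home":     ["about", "services"],
    "about":    ["home"],
    "services": ["home", "about"],
}

ranks = {page: 1.0 / len(links) for page in links}  # start evenly

for _ in range(50):  # power iteration until the scores settle
    new_ranks = {}
    for page in links:
        incoming = sum(
            ranks[src] / len(outs)          # each linker shares its rank
            for src, outs in links.items()
            if page in outs
        )
        new_ranks[page] = (1 - DAMPING) / len(links) + DAMPING * incoming
    ranks = new_ranks

# "home" collects links from both other pages, so it earns the top score
best = max(ranks, key=ranks.get)
```

The takeaway is the part the post describes: rank flows through links, so a page with more (and better-ranked) incoming links ends up sitting higher.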
Once we've determined what you want to rank for, how you currently stack up in your niche market and where to focus our efforts, you're going to start hearing terms like geo-targeting and click-through rate a whole lot. Geo-targeting is the process of constructing your website and its pages to be specifically relevant to certain areas. You can easily work city-level geo-targeting into your site with adjustments to content, and you can even drill down into neighborhoods if you begin to use tools like AdWords. By targeting your website you ensure that you're working at capturing your target market, and increasing your overall click-through rate. Click-through rate, loosely defined, is the percentage of searchers who click on your link after performing a search. It's a great metric to keep track of, as it can fairly quickly show you whether a new campaign or advertising strategy has had a positive or negative effect on your brand and business.
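As a quick sketch of how you'd keep track of that metric, here's the click-through rate calculation in Python. The impression and click counts are made-up numbers for illustration.

```python
# Click-through rate: the share of people who saw your listing and clicked it.
def click_through_rate(clicks, impressions):
    """Return CTR as a percentage."""
    return 100.0 * clicks / impressions

# Hypothetical numbers from before and after launching a new campaign.
before = click_through_rate(clicks=120, impressions=8000)    # 1.5%
after = click_through_rate(clicks=300, impressions=10000)    # 3.0%

# A rising CTR is a quick signal the campaign had a positive effect.
campaign_helped = after > before
```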
There is a huge amount of information and due diligence involved when you're working on your company's website. You need to consider both the technical and aesthetic aspects of your site. Is it appealing to look at, or is it full of uncoordinated colors and themes? On the technical side, does it load quickly, or have you filled it with pictures, videos and sounds so that it takes more than a few seconds to load? The internet doesn't work in minutes; if you don't capture your audience in the first 3-5 seconds of being on your site, that impression is lost. Take a good hard look at your navigation and the menu structure on your site. Is it written with clean code, easily crawlable and indexable, or have you built it with scripts and images which mean nothing to search engines?
Have someone outside your business take a read of the content on your website: are they able to work out what it is that you do with just a quick glance? Oftentimes, when a company is building a new site for themselves they can get carried away with their content, and they begin to create content which is too niche-specific, resulting in lost visitors and relevance. Once you've got all of your content squared away, you can create proper links to the other web pages you've built, to help the bots get at all of your available content and push you that much further up the ranks. Think of your website like a sailboat, and your additional pages and content as added sails. The better job you do building them, the more power you'll have behind you.
We've always maintained that those with certain skill sets should do certain jobs and stick to those jobs. Web designers should design, coders should stick to coding, SEOs should stick to SEO, and so on. There is, however, one resource that everyone should read and keep handy when building, repairing, or working on a website. Go hit up the Google Webmaster Guidelines for best practices. It's the best stepping stone you can use to begin to have a chance online, and while it won't help you rank #1 for all of your niche terms, it will keep you from being targeted by Google's biggest updates, à la Panda and Penguin.
Currently Bing is going through a transformation of sorts: they've revamped their look and performance, changed up the way they do social, and tried to streamline everything overall. The current end result? In their own internal testing they've come out ahead of Google, with a near 10% gain while Google lost 10% of their score. So what's Bing been up to?
Firstly, they've been working hard at incorporating more of the social web into your search results. Earlier in the year, Google introduced their version of this idea as Search, plus Your World, and it was met with the ire of the masses. The claim was made that Google was favoring their own social network and shunning Facebook and Twitter, with Google countering that they couldn't gather information from those sources. Bing currently manages to pull information on searches from all of these sources: Facebook, Twitter, as well as Google+. It may seem as though Google was just blowing hot air, but it needs to be mentioned that late last year Twitter did effectively block the search engine, and Facebook keeps a pretty tight handle on what gets out onto the web, even with open and social profiles. Microsoft's Bing currently has deals worked out with both of these parties to index their information, and as for Google+ profiles, if they're set to public then everything on the page is indexed like any public website.
Bing used to have your social mixed in with your search results, but they decided to change that idea and go in a completely different direction. All of the social search results have been shoved off to the right side of your screen, where your friends, family and colleagues are ranked by relevancy to your search. Also included in those social results are people and items which may be relevant to your search. The reason for the change, according to Bing, is that having the social results mixed in with the organic ones diluted the page too much, and your searches suffered for it.
So where does that leave us? Bing is in the process of launching their completely revamped search and social service, and they've made big gains in the search world, based on their own internal testing. A blog post on that point makes it a little clearer:
We regularly test unbranded results, removing any trace of Google and Bing branding. When we did this study in January of last year 34% of people preferred Bing, while 38% preferred Google. The same unbranded study now shows that Bing search results have a much wider lead over Google's. When shown unbranded search results, 43% prefer Bing results while only 28% prefer Google results.
As with all things, changes to the way we use the internet happen on an almost daily basis. From a single browser interface to now having half a dozen available depending on preference and platform, web tech has been changing and evolving almost as fast as the web itself.
Take browsers for example. Just a few years ago, in 2008, the online world was dominated by Internet Explorer, followed by Firefox and just a sprinkle of the odd ones here and there. That was the year Google Chrome was introduced, and since then the top seeds have changed some. As of the start of 2012, the browser market is fairly evenly split between the top two, Firefox and Chrome, as the most widely used, with Internet Explorer coming in a distant third and the rest still just a smattering on the internet landscape. As of March 2012, Internet Explorer has dipped under 20% of the browser landscape; thankfully at least half of that market uses an updated version of the browser, version 8.
But browsers aren't the only change we've had in the last few years online; social media has become a massive market on the web. The largest player in the space needs no introduction: Facebook entirely crushes the social market, with around half a billion users logged in on average per day. The discouraging portion of that number, however, is that nearly half of the businesses out there don't even use social media marketing to their advantage. Only about 20% of businesses are using Facebook to push their brand, with smaller business owners more readily embracing the technology. Knowing it's an avenue that needs to be explored, and taking the step to do so, are two different things, and it seems that a lot of the time people simply try to make it complicated. A key concern for any marketing is return on investment, and while organic search engine optimization is the best return in the business, its cost and time factors make it difficult for those with very shallow pockets. Free advertising, though, like what can be found on Facebook and Twitter, can be easily measured: broadcast your ad or tweet, and measure your traffic over the next couple of days. It's not magic, it's simple math when you keep it basic.
The goals of SEO are relatively simple: to make your site rank as highly as possible within the search pages for your niche. Whether you build houses, write stories, or draw pictures, search engine optimization is applicable to any website online. What a lot of smaller business owners can also use SEO for is to knock the big players down a peg or two.
It's important for all parties to consider SEO as a great equalizer online; you do, however, have to remember to stay within the rules. There are billions of web pages online, and yet with that daunting number in mind it's still a relatively simple process to stay in the good graces of the search engines. All you really need to keep in mind are the basics; even just following the best practices guidelines gives your website a shot at being picked up and indexed. But you also need to remember that the internet isn't exactly a friendly place yet; a great deal of the web is free and wild. As a small example, you can't control which websites choose to link to you. This can be a difficult hurdle to overcome, as irrelevant or inappropriate backlinks leading to your website can seriously hamper any SEO efforts you may have in place. This is only a single element of what's known as negative SEO.
The larger, more established and authoritative sites such as Amazon are somewhat safer in this regard, however no one is completely immune to negative SEO. Negative search engine optimization can be defined as spammy links, blatant keyword stuffing, duplicate content or anything else that isn't considered white hat SEO by the search engines. Smaller, newer sites unfortunately are more susceptible to negative optimization problems. In the beginning of a site's growth, it may not have much content or many links pointing to it, and if you're not careful with how you craft your content or structure your links and navigation, you may even get dinged as having duplicate or irrelevant content in your niche. The number one point to keep in the forefront of your mind, though, is that because the internet is still wildly untamed, the playing field is actually relatively plain and simple. Follow the rules, manage your website, and monitor your content to make sure it doesn't get scraped and that it hasn't been copied from another resource. Even the big hitters can be taken down online; no target is too big or too relevant on the web.
So the large update that Google pushed out late last week, which has a name you can now curse – Penguin – has had its share of folks caught in the crossfire and down-ranked. In case you were wondering, the short version is that the update was targeted directly at reducing webspam, and at sites which use "aggressive spam" tactics.
As always, Matt Cutts came out on his white horse maintaining that so long as you create quality, original content and stick to the best practices, you should be alright with this new update. What was discovered over the weekend, however, and something site owners couldn't entirely be prepared for, was the collateral effect of targeting spammy sites. While as a site owner and web admin you can control what content is contained within your site, you unfortunately have very little control over who, or what, links to your site.
Larger online brands have felt little change from the update so far, but that doesn't help any of the smaller sites out on the web. While Google mentioned that only 3% of search results would be affected, it seems as the week gets underway that the number will be a tad higher. The notable sites cropping up in discussions tend to be smaller e-stores which are using shared or affiliate information. In an affiliate layout, already not one of the search engines' favourites, if any one site in the chain adopts bad practices, then the down-ranking factors will eventually reach your site as well.
Amid all of the uproar over sites being dropped in the rankings, or in some cases completely lost, there have been some valid suggestions. One of the most basic, and most helpful, would be that instead of Google punishing everyone linked in a bad chain, it simply remove any ranking or relevancy from the original, infringing domain. That way not every site down the line gets kicked, and site owners won't immediately go into panic mode.
When you've decided to build yourself a new site, whether due to needing an update or just looking for a new image, there's a very important step to monitor. You need to ensure, before you get too far into the process, that you're not making the rookie mistake of allowing the search engines to index both versions of your website. Doing so can cause you grief and could ultimately penalize both websites for duplicate content.
When you've begun working on the newest version of your site, you need to ensure that it's not being indexed by the search engines, so you can work all you like without worry. The simplest way would be to use your htaccess file to block the bots, or alternatively, if you have the means, you could work on a local server where the site isn't technically on the internet. Duplicate content can cause Google or Bing not to know which page it should list in response to a search. The search engines suddenly have two versions of your website and content to consider, and need to determine which they feel is the more relevant of the two. Seeing as your old site originally had the content, you stand to injure your brand's reputation and new URL simply by working on a new site or look.
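As a sketch of the htaccess approach, the snippet below sends an X-Robots-Tag header telling compliant crawlers not to index anything served from the staging copy; it assumes Apache with mod_headers enabled. Password-protecting the staging site goes a step further, since it keeps out stray visitors as well as bots.

```apacheconf
# Hypothetical .htaccess for a staging copy of the site (Apache + mod_headers).
# Every response carries a header telling crawlers not to index or follow it.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```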
Duplicate content isn't just a concern when you're working on your own website; it's actually something you should make a note to occasionally monitor. A bothersome trait, and a difficult problem to tackle, is when your own original content ends up being scraped by a bot and winds up on an aggregator site. You can look for your own content by searching for key phrases and terms you've used within the content and/or title, and hopefully the only sites which come up are your own, or those you've given permission to reproduce it. Typically scraper sites don't rank that highly in search anymore, but there are still occasions where they show up higher in the results than the original creators. When this happens, you often become trapped in a terrible cycle of trying to have your own hard-earned content removed from the index, and having credit given where credit is due.
Today overseas in Germany, Google both won and lost a court case over YouTube. How can that happen? Well, it's an interesting case, one which, if the verdict is upheld, will be used as a marker for future dealings with the platform.
Google has long contended that YouTube is simply a host, not a creator, of the content which you can find on it. Because anyone can create an account and upload anything they want, YouTube is by definition a host for content. There are some very basic editing tools on the site, but you can't record anything on, or through, their site or software. Today a court in Germany ruled that Google needs to install filters within YouTube in order to detect and stop people from gaining access to materials to which they do not own the rights. The judge also said that Google is not responsible for the uploaded material itself; it merely needs to do more to help stop copyright violations. That is how you can win and lose at court: Google and YouTube were legally absolved of responsibility for the content on the service, and were instead charged with helping to clean it up.
Among the small victories, as well as being told YouTube isn't responsible for the content being uploaded, they were also saved from having to sort through the entire catalogue and purge anything with a copyright tied to it. With billions of hours of video, that would be an impossible undertaking at best guess. Just because the case has been decided doesn't mean YouTube and Google are taking it lying down; they still intend to appeal the decision, as even a partial loss is still a loss. GEMA, the German organization which controls royalty payments for the materials it holds rights to, is the party which took YouTube to court, over 12 songs uploaded in 2010. Google has said they will be negotiating with the organization so the artists it represents receive their due.
While most in the search industry fluctuate within a few points, over the last few months Blekko has enjoyed a huge increase in traffic. Since the beginning of the year, Blekko has seen a 350%-plus gain in traffic, and expects to reach 400% by the end of the month. These are all unique IPs accessing the site to conduct searches, and likely SEOs taking advantage of the tools the engine has available.
Blekko was already enjoying slow and steady growth in 2011, averaging just over 1.5 million uniques in the month of December. But flipping the calendar page to 2012 seemed to herald a new beginning for the slash tag search engine. The uniques for the first month of the new year doubled what they had seen in December and broke the 3 million mark. And while the initial information came from Blekko themselves, casting a little bit of a shadow on it, it's since been confirmed that while the exact numbers aren't known, the growth is there.
There are always shifts on the web; new sites grow, old sites decline in traffic. But sudden, massive growth like Blekko is experiencing should still be taken with a grain of salt. Those in charge of the company offered a few reasons why they feel they've experienced such explosive growth in the new year, and probably the largest is that they've taken the time to make their presence known. The company has made a point of attending major conferences to tout their strengths, so it shouldn't be too much of a surprise that they're experiencing higher growth than before. They also listed their recent upgrades as a reason for the sustained growth, which helped deliver an improved index for people who use the engine and build their slash tags. On the technical side of the equation, with the loss of Yahoo Site Explorer and the new tools which Blekko offers, they've undoubtedly seen an increase in traffic to that area of their site as well.
Competition in the search space is a great thing, and Google has said previously that they welcome it. It encourages change, growth, and an ever-expanding choice in what the public can use.
There are a few basic rules and ideas that you should always keep in mind when working on the web. Sometimes, no matter how often you've done the same steps before, you make a mistake. Depending on the severity, you can take down a website, mess up a web page, or make just minor little code mistakes which throw off your page layout in the odd browser.
One of the most basic points to keep in mind while working on your website is to keep it simple. A less repeated, but just as important, lesson is to always back up your work. No matter how basic or simple your steps may be, you should always keep a backup before you push your changes live. Not keeping a backup of your original site or content before getting to work on it is a simple mistake, one which can cost you more work if you're not careful. Even seasoned coders make mistakes, and when they happen, a blog for example *cough* can be offline until a backup is restored.
But enough about completely crashing a website or losing content and materials; there are small errors you can make which can also hamper your site, and which aren't as immediately obvious. If you've been rewriting your simple tags, say your title, description and keywords (yes I know, the internet says they don't really matter anymore), and you happen to mix them up with the wrong content, you could see a negative impact on your rankings. And even the loss of a single position in the search results can equate to lost conversions. Another common error, one which doesn't directly impact your rankings and website performance and is a tad more difficult to detect, is mis-tagging elements on your pages. It may seem a small and innocuous step to miss on a website or page, but every little thing does add up. And when it comes to optimization and your online competition, every little bit helps.
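To make the tag mix-up concrete, here's what those simple tags look like in a page's head. The business name and wording below are invented for illustration; the mistake described above amounts to pasting one page's title or description into a different page during a rewrite.

```html
<!-- Illustrative only: the site and copy below are made up. -->
<head>
  <title>Custom Deck Building in Vancouver | Example Contracting</title>
  <meta name="description"
        content="Example Contracting designs and builds custom cedar decks
                 across Vancouver. Free estimates on all projects.">
  <!-- The keywords tag carries little weight these days, but if you keep it,
       keep it matched to this page's content, not another page's. -->
  <meta name="keywords" content="deck building, custom decks, vancouver">
</head>
```

Each page should carry its own title and description; duplicating or swapping them between pages is exactly the kind of small, hard-to-spot error the paragraph above warns about.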