More often than not you’re looking to have your website’s visibility increased when contracting an online branding agency, but as a different train of thought, did you realize you can also use these techniques to hide your website from the search engines?
To clarify a tad, you don’t typically engage a marketing agency to hide your business, but there are cases where you’d want to keep the search engines at bay. Depending on how your site is constructed, for example, you may want to keep them out of your image files, or perhaps you have an extensive PDF collection hosted on your site that you’d rather not have indexed as part of your website. Maybe it’s even a new website build, done to update your look and content, that needs to be completely hidden from the search engines until you’re ready to go live, just in case the bots get confused – they’re not intelligent at all in this respect. For those reasons, I submit a short list of ways not to be found on the search engines.
One of the simplest and quickest ways to hide from the search engines is by editing your web host’s robots.txt file. Not all website owners have access to change this file, but when you can, it restricts search engine behaviour like nothing else. With a single directive, ‘Disallow’, you can completely hide your site or individual pages from search engine spiders. But just as Uncle Ben said, ‘With great power comes great responsibility’ – you need to be extremely careful with what you choose to disallow to the search engines. You may inadvertently block key components or pages of your site, rendering any search engine marketing efforts useless.
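As a quick sketch, here’s what a robots.txt covering the earlier examples – image files and a PDF collection – might look like. The directory names are placeholders; yours will differ:

```text
# robots.txt — lives at the root of your domain
User-agent: *          # applies to all well-behaved crawlers
Disallow: /images/     # keep spiders out of your image files
Disallow: /pdfs/       # keep your PDF collection out of the index

# The "great power" version — hides the entire site:
# Disallow: /
```

Note that robots.txt is a request, not a lock: well-behaved bots honour it, but it won’t stop a bad actor.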
If the robots.txt change is too broad a stroke for your case, but there are still certain pages you’d like to keep the search engines from finding, you can use a meta tag in the head of each webpage you’d like to keep out of the index. It’s as easy as it sounds: add a robots meta tag with the value noindex to hide a given page, and there’s a handful of additional values which can be added to sculpt how the search engines flow through your site.
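Here’s what that tag looks like in practice, placed in the head of the page you want hidden:

```html
<!-- In the <head> of any page you want kept out of the index -->
<meta name="robots" content="noindex">

<!-- Additional values sculpt crawler flow, e.g. keep the page out of
     the index but still let spiders follow its links: -->
<meta name="robots" content="noindex, follow">
```

Unlike robots.txt, this works per page, so you can hide a single page without touching the rest of the site.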
The last two methods for restricting bot activity on your website are the biggest guns in the game: 404 pages for dead ends, and, if you really get serious, 301 redirects to send bots and visitors alike to a different destination altogether. These are the big moves, however, affecting your visitors and bots alike, and they’re not to be taken lightly. Redirecting your traffic flow entirely to a different destination is a step that needs to be taken with care and foresight. One improperly executed 301 or 404 on your website could bring your entire marketing campaign crumbling to the ground.
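As an illustration only – assuming an Apache host with .htaccess enabled, and with placeholder paths – both of the big guns can be set up in a couple of lines:

```apache
# .htaccess (sketch — assumes an Apache host; paths are placeholders)

# Serve a custom page for dead ends instead of the default 404
ErrorDocument 404 /not-found.html

# Permanently send visitors and bots from an old page to its replacement
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The 301 tells search engines the move is permanent, so they transfer the old page’s standing to the new address – which is exactly why a wrong one is so costly.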
There have been a number of major updates recently, and while the search game keeps changing, our message will always remain the same. Being found online is important; becoming a brand is the real goal.
A couple of the recent major updates seem to have shaken up a good deal of the marketers in the SEO field, especially the change to the way that keywords (aren’t) relayed anymore via stats. The ‘not provided’ change happened just around a month ago now, and some of the lightweights in the search world are feeling the pinch without that keyword data. It doesn’t take much searching on any search engine to find a blog about the end of SEO now that you can’t work with the keyword data the engines previously provided. The short answer to them from us here at Freshtraffic would be something along the lines of: see you later, then.
PageRank is another point that still seems to make waves with online marketing providers, as it hasn’t been updated in months. PageRank was a handy tool to quickly gauge how valuable Google considered a website, and a lot of people seem to have taken that to heart. It isn’t a stretch to think that with the keyword data shut off and PageRank no longer being updated, the use of PageRank as a ranking factor will be going the way of the dinosaurs. That’s not entirely a bad thing, however, as it can allow for a more objective search experience, and hopefully increased competition between brands.
The other major change in the last month has been the Hummingbird update. It has introduced the idea of contextual search to the web, so now you can actually use a phrase like “Who won the 1954 Stanley Cup” (the Detroit Red Wings, by the way). It’s a method of search that conforms more closely to the way a person speaks in conversation with another: you can ask questions to find what you’re interested in instead of trying to be clever with your searches.
All of the major changes Google has brought to the web over the last year have been towards a more fluid use of the search engine. It’s much more streamlined, and allows for real online branding opportunities to be drawn upon. Why would you want to push for building a brand as opposed to strictly internet marketing? We’ll delve into that canyon next time.
When you have your business online and are marketing towards a certain niche, the more consistent and relevant you can make that marketing, the greater your chances of success.
Consistency and relevance are two of the more important factors that affect your online visibility, and surprisingly, one of the most damaging things you can do to a successful campaign is rebuild your website. That’s not to say a facelift for your website is always a bad thing, but you need to keep your audience in focus. Are you after general traffic as an informational site, or are you after qualified leads in order to make on-site conversions and boost your bottom line?
Significant issues can arise when launching an updated version of your website, not only from a search engine bot perspective but from your users’ end as well. While the web is always in a fluid state, if you suddenly revamp your website without notice to your end users, you will experience a sudden, though likely slight, drop-off in traffic and page loads. To handle any loss in user experience, one of the simplest measures you can take is a post or banner with the note “We’re updating!” or something to that effect. What this does is put the comfort into your visitors’ minds that there’s a reason the site looks different, and that it’s not the end of the world if they can’t find something right away. Use a feedback form for returning and new visitors to gauge how well your new site is being received, and also as a notice if something is missing or not functioning correctly on the site.
From a search engine spider perspective, you need to be careful about how you build your new site. In a perfect world you’d keep everything exactly the same as it was structurally, with the same URLs and everything that goes with them. But since that’s probably very far from the truth, you need to apprise your SEO team of all changes, not just some of them. If you change a URL, for example from home.html to home.php, that needs to be communicated, as search engines do not view them as the same page. I won’t even begin to get into the 301s that need to be considered; the idea is that your online marketing team works through the entire site before it goes live to the world.
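A change like home.html to home.php can be bridged for the bots with a one-line permanent redirect – sketched here for an Apache host with .htaccess, the domain being a placeholder:

```apache
# .htaccess — tell bots and browsers the old address moved for good
Redirect 301 /home.html http://www.example.com/home.php
```

Multiply that by every URL that changed in the rebuild and you can see why the SEO team needs the complete list before launch.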
Professionals are contracted in all manner of business, whether they’re plumbers, carpenters, coders or marketers – the idea is that they’re being paid for knowledge you do not possess. The rule of thumb should always be: when you’re given instructions to complete, listen, as there’s a reason they’re advising you.
When you’re designing a website for your business, you need to make sure you’re making the correct decisions, and not leaving yourself forever playing catch-up with the search engine guidelines.
One of the biggest issues we run into with new businesses and their websites is the content they use, or the lack of it. It’s one of the most unpopular conversations to have with a client, but your content, and the keywords you use within it, can be the first and largest hole that helps your site sink or swim. There’s a fuzzy number out there – by that I mean there’s no real concrete answer, only best guesses, and as to the proper amount, the search engines aren’t talking. On average, you should be delivering your site’s message and its key terms somewhere in the range of 3-5% of your total content density. The quick and dirty explanation: if you write a 100-word page about the color blue, you’ll want the word blue in there 3-5 times – sounds simple, right? It really is; the problem is the other 95 or so words. They still have to fit the theme of the site/page and can’t just be gibberish, so try not to focus on numbers and shoot for well written, easy to read content for your site.
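If you do want to sanity-check that 3-5% rule of thumb, the arithmetic is simple enough to sketch in a few lines of Python. This is an illustrative back-of-the-envelope calculation only – the search engines publish no official formula:

```python
def keyword_density(text, keyword):
    """Occurrences of a keyword as a percentage of total words.

    A rough, illustrative metric only -- search engines don't
    publish how (or whether) they weigh density.
    """
    # Strip common punctuation and lowercase so "Blue." matches "blue"
    words = [w.strip('.,!?;:').lower() for w in text.split()]
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words) if words else 0.0

# For a 100-word page about the color blue, you'd aim for a
# keyword_density(page_text, "blue") somewhere around 3-5.
```

Remember the closing advice above: if the number looks off, fix it by writing better prose, not by mechanically stuffing in the keyword.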
There are two other points dealing with content that we stress when speaking with our clients, one is the always growing social aspect of the web, and the other is spam, or the appearance of spam.
Where the social web is concerned, having your site and your PR people active on Twitter and Facebook is a good thing, even if the jury is still out on just how much it benefits your SEO campaign. But you need to bear in mind that those efforts alone will not drive you up the search results pages. When you boil it all down to the roots, it’s your website content that will drive your site up the SERPs and brand your small business online. Social media leverage is great, but your content is paramount – don’t miss the forest for the trees, so to speak.
One more point about your content, and it has to do with the nature of keywords and the language you use when you put it all together. It’s an unfortunately common occurrence to find out that the reason a site has never done well in the results is that the way its content is written comes across as spam. It’s the difference between being spammy and being readable that plays to your favor on the results pages. A quick and simple test to find out if your content is on the wrong side of that line is to have someone read it out loud to you. Trust your instincts: if it sounds funny, you’re much further ahead rewriting your content than being caught on the wrong side of the search engines’ wrath. Don’t worry about the numbers and figures for your content – make it readable, make it relevant, and the web will do the rest of the work.
While everyone is still slightly reeling and recovering from the upgraded Penguin addition to the algorithm, at a recent conference Matt Cutts spoke about the Panda portion, and elaborated a little on why we are no longer being directly advised when it has been running.
In case you happened to forget, the Panda portion of the algorithm is there to help determine just how high quality your website content is. It checks on things like originality, relevance, and whether or not it’s over-optimized as well. The spokesmen for the search engines have always maintained that the content you create on your website should be for the users, not the robots, and Panda was their automatic process in that regard. When you cruise around the news and blogs of the SEO world over the last few years, you can always find posts every few weeks where someone verified through Google that an update either was happening or had recently happened. Just a couple of months ago, Cutts came forward and said they would no longer be advising whether Panda was being refreshed or run across the web and their index. The reason is actually quite simple: at present, Panda runs every month over the course of 10 days or so.
The web is nearly an immeasurable amalgamation of pages, gifs, movies, sounds, information, basically if you can imagine it, it’s on the web somewhere. And while some website owners were hit in a big way by Panda and Penguin, the increase in frequency of it running is a very good thing. In the past, every month or so Google would have a big shakedown of the index, hitting the pages hard with an updated version of the algorithm to try and clean out some of the spammy sites and those that weren’t abiding by the rules. Ever since they came up with the additions of Panda and Penguin, there have been some fairly small changes to the size of the index, impacting only a few percent of the total sites out there, but when you’re talking about trillions of pages, even a small amount can become huge. Panda running for 10 days of the month every month is a good thing, just think of it that way. It helps keep the index fresher and more spam free than it ever has before.
There are many steps to the optimization game, there is the content management side of things, social optimization which takes a lot of hands on time to properly handle, and there is the link building side of the equation. Just like when you work your content to be readable and highly relevant to your niche, you also have to make sure to properly build your back link profile. The level of detail needed to properly sculpt your profile is thankfully left to us SEO experts to make sure it’s done properly. If you would like to try it on your own for some unknown reason though, there are a few rules you need to keep in mind when searching around the web for a back link.
First and foremost, one of the largest mistakes you can make when working on link building is to presume a back link from someone simply because they’re in your market or niche. Just because someone shares a complementary aspect of your market does not necessarily mean they’d automatically like to link with you. By this point in the age of the web, that’s thankfully a mostly lost tactic in the back link game. Occasionally you will receive an email saying that a website has created a link to your website; the very first thing you need to do is check the integrity of their site. If necessary, webmaster tools can be used to disavow a link to your site if a request for removal has been ignored.
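For the disavow route, the file you upload through Google’s disavow tool is just plain text, one entry per line. A minimal sketch, with placeholder domains:

```text
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Or disavow a single linking page:
http://link-farm.example/links.html
```

Treat it as a last resort after removal requests have been ignored – disavowing good links can hurt as much as keeping bad ones.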
One of the fastest growing methods of contact between webmasters in the last couple of years has been social networks. Between Facebook, Twitter, LinkedIn, Pinterest and Google+, there are more than a billion users on Facebook alone with which to get your name and address out there. You can create your content, flog it under your own banner, and work to garner back links based on your own quality. One of the biggest reasons to build your back link profile this way is that you can say with certainty the credit for the links is due to your hard work, and not the result of spam emails. Social networking and building your business through the social channels of the web is one of the newer ways to generate buzz, mostly in the local scene, but it’s a highly effective form of marketing limited only by the time you’re willing to put into it.
And one last important tactic to keep in mind when working on your back link profile: do not fall for the contact emails which offer to sell you packages of links. In the best case the links will be useless to your campaign, and at worst they may serve to sink your website in the rankings. The negative side of this equation is much greater than the positive. Anything worth doing online is worth doing correctly, and it can be difficult to recover from a bad link profile, so it’s best to stay away from easy solutions like an emailed offer.
Google is the biggest fish in the pond, of that there is no doubt, as they retain somewhere north of 65% of all online search activity. There are others in the search game – Bing, Yahoo, Baidu, Blekko – all of which have their own little piece of the pie and serve their own version of the internet as they have indexed it. It has never been a surprise to see Google become the target of anti-trust suits, targeted negative ads, and lawsuits over some really silly topics, and yet despite all of those things it is still the most widely used search engine on the web.
One of the major issues that competitors and detractors of the search engine enjoy bringing up is the level of spam and even malware that searchers can sometimes find in the results Google delivers. A recent survey, however, has turned up some interesting results regarding the malware side of the equation at least. The oft-touted second most used search engine out there, Bing, has come out on the wrong side of the malware comparison. While it’s entirely true that Google delivers a great deal more search results in total, it is the Bing results which have the higher rate of dangerous links attributed to them. How bad can it really be? Bing was recorded as having nearly five times as many malware links as the results pages that Google delivers, and Yandex, the Russian equivalent of Google, gives up 10 times the bad links.
All of the search engines tested were found to be consistently removing malware from their results pages; it just turns out that Google is doing something that much better than everyone else. Every search engine is doing all it can to handle the malware pointed at it, but because each search engine is targeted in different ways, the levels of malware differ. The process of detecting websites serving malware hasn’t changed so much as the process for delivering them has. Malware programmers aren’t typically the most rule-abiding bunch to begin with, so they are always looking for ways to circumvent the safeguards put in place by the search engines. As always, practice safe search, and if a link in the search results seems too good to be true, it very likely is.
In the last few months Facebook has come out in the open about their own search offering, and if you’re interested in trying it you can sign up for it. Graph Search is an intriguing idea, but as numerous blogs and articles on the web quickly discovered, the results it returns can be a little on the flaky side at times. You can even go so far as to toy with the search interface and come up with some very unusual search settings.
The service is still in its infancy; it has a lot of learning and a lot of growing to do. One of the main complaints that has come up, however, consistent across all the articles, is that the web search feature provided by Graph Search is lacking. In fact, it’s lacking enough that it may as well be non-existent, and there were writers out there who had hoped the service would improve over time. It seems their prayers will go unanswered, for the time being at least, as Grady Burnett, VP of Global Marketing for Facebook, said in no uncertain terms that there will be no external search engine. The actual quote from SMX West:
GB: I don’t see that happening. We called it “Graph Search” because we’re focused on letting people search the Facebook graph. So my answer would be no.
There will be a handful of different responses to this message from the company – some will be cheering, others jeering, of course – but those in the search industry who truly understand won’t be surprised at all. When you consider all aspects of the internet, not just search and social, a picture begins to form. Any map of the internet obviously isn’t an exact replication of what it looks like or how it’s divided, but it makes it easier to understand, and to see why Zuckerberg, who built the largest social network in existence, isn’t worried about external search at the moment. There is so much more out there that isn’t Google or Bing or Yahoo to worry about; they’re only a fraction of what makes the web so massive.
While Google is undoubtedly the largest search engine on the web with its trillion pages indexed, it is not the only tool out there with which to make your way around the web. But while there are hundreds of millions of web users out there, only a handful of search engines garner any real use.
Google, as mentioned previously, clearly holds the dominant spot online and has for a number of years. With more than two-thirds of the market share in search, it has a massive presence on the web. With the clout they carry in the worldwide market, any business that has a website is keen to try and make a place for itself on the front page. And the bigger the target, the more detractors one is bound to have, and Google definitely has the majority share. With privacy issues, a social platform that (at first) floundered and has grown somewhat stale, and a long list of competitors claiming anti-competitive behaviour, it seems amazing that they could still be in business; but while they haven’t made friends with every user on the web, 66% is more than enough.
The second most widely used search engine is really two, as one bot delivers results for both: Bing and Yahoo together gobble up the majority of the remaining search activity. The Yahoo results pages have, for more than a year now, been provided by the Bing search bot, as opposed to Yahoo running their own bot and building their own index of the web. And while this still allows the adopters of the Yahoo portal a way to browse the web, they’re not being delivered their own true results. The new CEO at Yahoo, however, seeks to change all of that; hopefully 2013 has some shaking up in store for the search world. Bing as a search service has been trying hard for a couple of years to break into the market that Google dominates. With some clever ideas in image search, flyout snippets of search pages, and sometimes widely differing results from Google, Bing holds a share of the market that hasn’t shifted much in a number of years. Perhaps they can rekindle their search agreement with Facebook and together develop a full-fledged social search service; only time will tell.
In the last little bit of the search world, you have some of the little guys who are trying to shake up the web. Blekko, one of the more interesting search services out there, bills its results pages as spam free. Your experience will vary wildly based on what, and how, you search, but their usage of what they describe as slashtags allows you to greatly fine-tune your search parameters. It’s an interesting technology and definitely gives a differing view of the web and its offerings. Another small fry in the search landscape, but one which caters to those concerned with privacy, is Duckduckgo. It has the same clean search UI as the others, with a basic text input box, but it delivers results from outside the “search bubble” it describes the other search engines putting you in. It’s a great option for a look at what the web might be like with no search history to go on; the results can be interesting to say the least.
The steps to rank your website effectively online are relatively simple, and can be broken down into a few very broad basics. If you have a simple website – say a few pages detailing a local business – then as long as you have a good title, strong content, and some kind of social presence, you have most of the puzzle sorted out. The big time sink, though, and usually the most difficult step to work out, is building up that backlink profile.
Building a proper backlink profile seems to have a lot of mystique surrounding it when you start reading online. Wading your way through the myths, theories, and hyperbole may seem like a daunting task, but the rules are simple to follow. It’s only time intensive because you actually need to work at building your profile properly; just like when you build anything else, if you make a mess of the foundation, the structure will come tumbling down. When you’re taking the time to build up backlinks, there are some basic questions you need to ask yourself, and once you’re satisfied with the answers you can decide whether to approach a site owner to work out a link exchange. First item on the checklist: is their site (the one you’re going to approach for a backlink) relevant to my website/business? Running around online building as many backlinks as possible just to have them is a bad idea; if they’re not relevant to what you do, then at best you get no help from them, and at worst you could be penalized. Once you’ve decided they’re relevant, start browsing their website, keeping good website practices in mind as you do. Do they have a lot of popups or funny activity on some pages? Just as you want your customers to have a great online experience, you want your link coming from a reliable source, because the web works in strange ways at times.
And those really are the bare minimums you need to follow when you’re looking for a link exchange or a backlink to your site: are they relevant, and are they staying within the rules of the game? If you’re satisfied with your answers, you move ahead to try and work out a link with the site owner, and that’s one link down. This process could take as little as a day or as long as a week or so, depending on the time you have to put into it and the size of the prospect you’re looking at.
The basics of building backlinks and what to look for are just as important as knowing what to stay away from. For every positive, authoritative backlink you could build for yourself, you need to steer clear of the places and pages which could sink you. Directory listings, as an example, aren’t innately bad for your link profile, but since Penguin last year, when so many were removed from the index, they’re not nearly as useful as they once were. A good link should not be the subject of an internal debate with yourself; when you see a good link you know it right away, and once you start debating whether it could be considered a good link, it just isn’t. And last but certainly not least: does the link enhance your brand to your customers? Because ultimately, that’s who you’re trying to reach.