The world of search engine optimization can feel like a large and fairly confusing maze. Every business can benefit from improving its ranking on search engines like Google and Bing, but as soon as you start looking for help, you get email after email and call after call from SEO providers promising you the top position on the results pages. The problem is telling which SEO firm is actually the right one for you, and how to even begin deciding.
Check the SEO on their own website:
When you search for their company name, are they at least the first result that comes up? Do other pages they own come up when you look for their name? Do they have strong meta information that helps sell their services? Meta descriptions don't carry any real ranking authority with the engines, but they still show as the snippet on the results page. Do they rank locally? Ideally you're in talks with a local agency, but if they can't show up for their own business name or their location, how could they possibly help you and your business?
What is their cost:
If they’re offering discount packages for their “services,” they are likely not a company that will work in your best interest. A true SEO company will do an in-depth investigation before offering you a quote, because every business has a different market, competition, and goals. Imagine a house painter quoting you a price without ever looking at your home – you can bet they’ll either do a shoddy job, won’t finish, or will suddenly have all sorts of hidden fees.
What kind of relationship do they have with their clients:
Your goal is to build a relationship with a company that will be in constant contact with you for the entire scope of the project. They should be calling and emailing to understand your niche market and your online targets, and developing the conversion points on your website. Make sure they will report what they’re doing and the results, and in a way that you can understand. You’re trusting your reputation to another company, so make sure they have your best interest in mind, not just their bottom line.
Because of the way the search engines deliver their information to users, there has been a standing debate about who is actually responsible for those results. Some say the search engines themselves control the results pages; the response from Google and Bing, for example, is that they don’t control the results, they merely display them.
Last month the EU put forth a ruling that everyone has the right to be forgotten: a process by which users can submit requests to have URLs they feel are unacceptable removed from the results pages. It opened the doorway to the SERPs being hand curated by their users, with the end goal being the removal of defamatory information from specific searches. It’s not something you can request willy nilly; you need to be either the person directly affected by the term, or the authorized representative of that person.
The form requires a photo ID of the individual the request is for. So even if a third party is submitting on someone else’s behalf, they need that person’s photo ID as proof of approval. The implementation is currently only available in the EU, however, and as of just a couple of days ago it was unknown whether the trend would carry over to the US side of Google. As it turns out, the right to be forgotten form will remain an EU-only feature of search, and it doesn’t completely remove web addresses from the index; it merely removes them from the EU results pages.
But as the saying goes, for every door that closes another one opens: Bing, which was affected by the decision as well, has decided to try and work the system in across the board. At last count Google was taking in around 10,000 requests per day through the form, so it’s clearly going to take some time for the SERPs to reflect all of the proposed and accepted changes. There hasn’t been any mention of how Bing is faring in the requests department.
When you’ve finally gotten your website online, there are a million steps you need to take in order to be ranked at the top of your niche market on the results pages. Instead of trying to explain each point, we’ll take a different tack this time around: a list of things not to do on your website.
If you’re not managing your site yourself, hopefully the person or agency you have contracted is on the ball and has a clue about how not to run afoul of the rules. If your site gets hit by a spam penalty, whether automatically by the algorithm or by a manual flag, it isn’t the end of the world; it can be fixed. But let’s get started so you can have a cheat sheet for checking your agency’s efforts where your website is concerned. A note before we get into things: these are not hard and fast rules. The internet isn’t even remotely a black and white entity, so take everything you read below with a grain of salt.
Misspelling words is an everyday thing; everyone does it, billions of times per day. But one way you can land on the wrong side of the web spam team is to register a domain name that is a misspelled version of a highly notable brand name in your niche, with the idea of generating traffic off the misspelled term. This is a good example of the web not being black and white: anyone can register any domain name so long as it’s available – but that doesn’t mean the search engines don’t have a say in where it’ll be placed in the results pages.
Having a meta refresh on your home page, effectively locking visitors into your website by messing with their browser controls. It’s not uncommon, when you arrive at the wrong website, to hit the back button or the backspace key to return to the page you were on. Using a meta refresh in the header of your website’s home page takes that option away. The basic sequence of events is that when a visitor lands on your page, it refreshes itself a time or two, so that when they press the back button they don’t actually leave the site. Instead they’ve just refreshed the page again, and they’re back where they don’t want to be. It’s a frustrating experience for users, and a no-no with the search engines.
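A minimal sketch of the trick described above – the URL is a placeholder, and this is shown only to illustrate the pattern, not something you should ever actually deploy:

```html
<!-- Placed in the <head> of the home page. A refresh delay of 1 second
     makes the page re-navigate to its own URL, stacking copies of the
     same page in the visitor's browser history, so pressing the back
     button just lands them on this page again. -->
<meta http-equiv="refresh" content="1; url=https://www.example.com/">
```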
Having your website built entirely in Flash, Java, or even some versions of Ajax or Silverlight, which require specific browser plugins to function correctly. While this isn’t a negative with the search engines specifically, purely visual coding effectively hides your website from them. Google, Bing, and the other engines look for text on a website; the text inside a Flash movie, and sometimes inside Java scripts, isn’t readable to them, so they assume it’s a blank page. They are getting better at digging the text out, but they’re not all the way there yet, so keep that in mind when a designer approaches you with a flashy visual display that has no real text elements. Along the same line of thinking, but this time where users are concerned: more and more people are accessing the web with tablets and phones. iPads and iPhones take up a sizable share of the mobile marketplace, and they cannot display Flash at all, or some Java; your site would literally be invisible and unusable to an iPad user if it were built entirely in Flash.
It’s been a busy week working on the web, with all of the heart attacks going on across it. In case you were living under a rock for the last 5 days, I’ll go through a quick recap for you.
The issue that popped up this weekend affected an absolutely massive portion of the web, with some reports saying as much as 65% of all websites had the potential to be affected. When you’re talking about billions of websites and trillions of pages, that’s a huge amount of the web. Just to be clear about the issue: the Heartbleed bug affected sites running vulnerable versions of OpenSSL – the free, community-driven software that many websites use to encrypt their connections rather than paying for a commercial alternative.
Without going into too much confusing technical jargon, the best description I found of the bug was this illustration of events.
The top portion of the exchange is how a secure connection’s heartbeat works; it’s a very simplified version of events between your computer and the web server you’ve connected to. The Heartbleed version of events in the bottom portion of the image is where the exploit got its name. The process is the same, but via what’s called an overflow error, a malicious user can claim to have sent a longer message than they actually did – and the server replies with that longer string of information, including whatever other data happened to be sitting in its memory next to your secure exchange.
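The over-read can be sketched in a few lines of Python. This is a toy model, not OpenSSL’s actual code; the buffer contents and function names are invented for illustration. The vulnerable version echoes back however many bytes the client claims to have sent, without checking that claim against the real payload:

```python
SECRETS = b"user=alice;session=deadbeef"  # invented data sitting beside the payload

def heartbeat(payload: bytes, claimed_len: int) -> bytes:
    """Vulnerable echo: trusts the client's claimed length."""
    memory = payload + SECRETS       # the payload lives next to other server data
    return memory[:claimed_len]      # no check that claimed_len <= len(payload)

def heartbeat_patched(payload: bytes, claimed_len: int) -> bytes:
    """The fix: refuse a claimed length longer than the payload itself."""
    if claimed_len > len(payload):
        raise ValueError("heartbeat length exceeds payload")
    return payload[:claimed_len]

print(heartbeat(b"bird", 4))    # an honest request echoes b'bird'
print(heartbeat(b"bird", 31))   # a malicious request also leaks the secrets
```

The honest request gets back exactly what it sent; the malicious one gets its four bytes plus everything adjacent in memory, which in the real attack could include keys and session data.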
The issue was found and corrected, and there are multiple steps you can take if you feel your personal web security was in question. CNet has a running list of sites which have been patched against the Heartbleed bug, and whether you should change your passwords on those services. Have a look through their list and follow the proposed directions to minimize any potential security issues you may have in the future.
image credit : vox.com
There is to this day a general misunderstanding about search engine optimization and just what it can do for your website and business; SEO will not sell your product for you.
What optimizing for search does do is give you visibility online – a very important component of online sales, to be sure, but only one side of the coin. For the sake of explanation, some assumptions will be made: since you likely have your own website for your business, it’s fairly safe to say you have some experience selling yourself or your wares to your intended audience. When you’re working a sale yourself, a solid general rule is around 1 in 10 – for every 10 contacts you make, you’ll earn a sale. It may seem low, but this is from a strictly hard sell standpoint, and from that standpoint the most difficult part of making the sale isn’t the conversation with the customer; it’s generating that initial point of contact. The days of people wandering down the sidewalk and walking into a storefront that intrigues them are dwindling; increasingly, consumers are turning to the internet to procure their desired goods.
If you already have a website, then a good 30% of the work is done already: you have the potential to turn that hard sell into a soft sell, because qualified visitors to your site are there because they want what you have. That’s where SEO, aka internet marketing, can help turn a paltry 10-20 visitors a month into hundreds, if not thousands if your market is big enough. What we can bring your business and website as SEO professionals is visibility in front of those qualified consumers – whether you want a sale, a sign up, or a contact-me-later email, search engine optimization can help make that happen.
What we cannot do, however, is actually force that sale for you. Every now and then during a campaign there is a tipping point where we sit down with our clients and essentially have the following conversation: now that we’ve addressed your technical and optimization issues, it’s time to talk about your conversion points and methods. What makes that conversation frustrating is when the advice is ignored or discounted – because now that you have all that visibility and traffic, your sales will go up by the same amount, right?
Have you ever stopped to consider why your website may not be performing quite as well as it used to? It is always worth stopping to take a close look at what you’re offering the public as an online presence, because sometimes a face lift is in order.
Online marketing and branding is still a rather new avenue of growth for every business out there, and it’s one that needs to be monitored and measured appropriately to make sure you’re getting the most you possibly can out of it. Every now and then a client comes to us with their woes of poor online performance, and when we look at their website it’s like looking through a time warp. An outdated appearance can be detrimental to an otherwise successful offline business. Technology is always improving, and we’re a long way from using tables and basic HTML scripting to design and build websites; a recognizable and intuitive website design is a significant part of a successful online presence.
Businesses are always growing and changing, and sometimes your old mission statement and goals don’t match your current model. That doesn’t mean you need to completely redesign and redevelop your website, but it always helps to revisit your content and your vision to make sure your website’s message matches it. Also keep in mind: how does your website react when you visit it with a smartphone or a tablet? If your site isn’t at least somewhat responsive, you’re losing out on providing your visitors with both their desired and required experience. If your site is difficult to use and navigate, your visitors are likely to leave in favor of finding someone else to provide for their needs.
As strange as it might seem, we’ve been approached by people looking for help online who have an incredible site, a great message, and good content, yet are just not meeting their conversion goals. There are usually only a couple of reasons this boils down to, one of them being that the conversion message, or call to action, is lost in the complexity of the site. Keep your message and your call to action simple, and you’re more likely to end up with that coveted sale. We have even had some site owners come to us and find that the designer of their site neglected to allow indexing of it, via an htaccess or robots.txt file. There is always time to reevaluate your website, its content, and its call to action – and if you need assistance with any of your online issues, make sure to call the experts here at Freshtraffic.
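That robots.txt mistake is worth checking for yourself; the file sits at the root of your domain (yoursite.com/robots.txt). A sketch of the two extremes – these are two alternative files shown back to back, not one file:

```
# Blocks every crawler from every page. Fine on a staging copy of a
# site, disastrous if it ships with the live site:
User-agent: *
Disallow: /

# What a live site normally wants. An empty Disallow allows everything:
User-agent: *
Disallow:
```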
The internet is a pretty big place, and with Facebook throwing its hat into the search ring with their trillion connections made, it shouldn’t surprise anyone if a search engine doesn’t immediately deliver exactly what you’re looking for on your first search.
Google is often placed under the microscope when complaints about the web or search quality come up, but it seems exceedingly rare that anyone actually talks about how big a job it is to be a search engine. Using Facebook’s example of an index of a trillion connections made through their social software alone, it should be clear that the web is a huge place. One estimate puts the size of the internet at somewhere over 100 trillion web pages, and users are often quick to pass judgement on the search engines when they can’t find what they want. Google is the largest and most widely used search engine on the web, still holding onto more than 2/3 of the audience out there, and even they don’t try to curate that massive number of pages completely.
When you factor in that many pages on the web, an algorithm that sorts, ranks, and tries to properly place every one that it crawls, <em>and</em> that it can deliver your results pages in less than half a second, it should really be amazing that it can be done at all. Constant updates and improvements to the algorithm that does the bulk of the work can alter the pages you see when you search, and sometimes even appear to completely break the results pages, as was the impression when Panda and Penguin were integrated into the algorithm. As an exercise in just how massive an undertaking this is – and how Google and the other search engines aren’t out to get you and your site specifically – give this a go. Imagine you have 100 pennies, each with a different year on it. After shaking them all up in a can, pick out the one with the year of your birth on it; if you don’t pick your year, it goes back into the can. You might get it in the first few tries, or it may take you 30-40. Now repeat that experiment 100,000,000 more times and you’ll have a sample of how much work the search algorithms do every time they perform your search.
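The penny exercise is easy to simulate. A quick sketch in Python – the 100 pennies and draw-with-replacement rules come from the analogy above; the function names are mine:

```python
import random

def draws_until_found(num_years: int = 100) -> int:
    """Draw pennies with replacement until 'your' birth year comes up."""
    draws = 1
    while random.randrange(num_years) != 0:   # slot 0 stands for your year
        draws += 1
    return draws

# Each draw is an independent 1-in-100 chance, so on average it takes
# about 100 draws per find -- and the analogy has the engines repeating
# the equivalent lookup 100,000,000 times per search.
results = [draws_until_found() for _ in range(1000)]
print(sum(results) / len(results))
```

Run it a few times and the average hovers around 100, though any single attempt can take one draw or several hundred.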
Shots have been fired across the bow of the Google command ship and they came from a source that is not only extremely early, but somewhat unlikely – Facebook.
Just a short time ago, Mark Zuckerberg stated in no uncertain terms during the most recent Facebook earnings call that they will compete directly with Google to be the kings of search. Over the last year Facebook implemented their graph search, which allows you to perform contextual searches based on your friends and whatever information they have shared with you, to hopefully find answers to your question. They’re working on being a source of answers instead of a source of results, all while targeting the mobile platform in order to facilitate mobile searches.
The Facebook graph search has, by their records, the largest index on hand – larger than any other web search engine’s. They estimate somewhere more than a trillion <em>connections</em> between their users, interests, groups, and so on. While the number sounds impressive, and while Zuckerberg believes they have the largest database on hand, the proof will be in the pudding, as they say. The actual size of Google’s index is difficult to even envision, let alone pin down, but the last count that gets passed around is somewhere in the neighborhood of 50 billion pages, growing at a rate of 5+ billion pages per year as people create, modify, and change their web presence.
The trouble with the house of cards Facebook has built for itself, as an opponent in this giant-versus-giant battle, is that it is unfortunately also tied to that earnings call. The likely timeline for Facebook to pose a realistic threat as a web search engine is 10 years. On the web, 10 years is an eternity where technology is concerned, and at the rate that Google and the other search engines are growing and adapting, Facebook is likely to be left out in the cold when the time comes to fight.
There is always someone talking about how SEO is a dead industry, and more often than not the doomsayers used a very specific type of optimization method.
When the online marketing game started, it was a fairly simple matter to get almost any website listed. You didn’t need content of merit or any kind of following for your website. You didn’t even need an okay website, never mind a high quality one, and as for a best practice guide, it didn’t really exist in the beginning. There were no pure white hat methods, although there were many black hat methods, and it took a while before the search engines even began to levy penalties against the worst offenders. This all started with real gusto across the web in the mid to late 90s.
As the web grew and expanded, and as the search engine bots, crawlers, and tech got better, the things you should and shouldn’t do began to become clearer. After a few years of clean up, the search engines and their algorithms fell almost into a routine. You could build a site, create or scrape some content, point any kind of backlink at it, and make it start to show up in the results pages. It was at this point that the term ‘search engine optimization’ really started to become widespread, and the notion that you could make money from SEO became an avenue for people who frequented blogs and discussion forums about the quickest and easiest ways to make a dollar online. This was the roughly 2005-2010 era of SEO, when the industry suddenly became inundated with experts in the field. It really shouldn’t be much of a surprise that these are the same folk who are calling SEO a dead industry these days.
In the last few years SEO has had some major shifts in the algorithm, much the same as the industry saw in 2003 with the Florida update, which cleaned up a great deal of the spam across the web. Penguin and Panda were the most recent additions to the Google algorithm, and they changed the world of SEO enough that the previously blogged-about methods of spammy content and tons of anchor text and backlinks disappeared as a viable strategy. They were very simple methods, easy to implement and even easier to spam across multiple sites to help drive a target to the top of the results pages. But because those means and methods became unusable as a reliable way to rank a site, to hear the doomsayers tell it, it is suddenly the end of SEO as a viable means of marketing. So the next time you’re approached by an agency who tells you ‘SEO is dead,’ take a moment and remember that the industry is far from dead – if anything it’s growing. It’s only that the wheat has finally been separated from the chaff.
The explosive growth of the web and the rapid pace of business development online caught a lot of the older, more established industries with their proverbial pants down. And every now and then, one of them tries to make a change to catch up to the pace of the web.
The Wall Street Journal introduced one of the first pay-for-news services by an online newspaper; they saw the storm coming and instituted the first known version in ’97. They started off slowly, but in less than ten years they had garnered more than a million readers, and they have been going strong ever since. Had more ‘old media’ agencies like radio and newspapers followed their example, it’s likely they wouldn’t be complaining about the loss of consumers as they head to the web to get more of the news they want.
The WSJ did the right thing, when it was needed, not only to survive in the online marketplace but to thrive in it as well.
But every now and then there is a surprise, and a business launches an idea so completely outside its scope of services that it’s startling to see – like a newspaper suddenly offering website design and development services. To be fair, that’s not entirely out of their range of business: they have an advertising department, and offering up designers for websites isn’t out of their realm of possibility. They already create ads to run in their papers and flyers, so it’s not entirely foreign that they could help out businesses that might need a website. When they start offering search engine optimization services, though, now we’re talking about leaving their realm of expertise completely. There is a very specific set of skills required to properly work an SEO campaign, and the odds that a newspaper can meet those needs are slim at best. Horses for courses, as they say – and the last time I looked, a print newspaper is almost the exact opposite of the online market.