Tag Archives: google

Common Sense Optimization

There are a good many steps involved in properly optimizing a website. On-site factors and changes, the time it takes to build a proper backlink profile, and making sure your website is properly built are all weighty deciding factors with the search engines.

On-page, on-site optimization is a time-intensive project. Your content needs to be broken down, weighed and evaluated against your competitors as well as against the search engine guidelines. Your entire website needs to be taken down to its base components; the text, images and navigation all need to be optimized to ensure you have the best chance to rank your site. Some of the larger concerns with your on-site work are avoiding spammy content and avoiding any tricky pieces of code that hide text or images – even when these things happen by accident, they have very real consequences for your website. Avoiding on-site issues is a fairly basic affair, however, and for the most part you can use your gut to steer clear of them.
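
If you want a quick sanity check for accidental hidden text, something along these lines can help. This is only a rough sketch, not a definitive audit: it uses the requests and BeautifulSoup libraries, the URL is a placeholder, and the style patterns are just a few common examples of how text gets hidden.

# Rough check for hidden-text patterns that can get a page flagged.
# Heuristic only: legitimate uses of display:none exist (menus, scripts).
import re
import requests
from bs4 import BeautifulSoup

url = "http://www.example.com/"  # placeholder: put your own page here
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

hidden_style = re.compile(
    r"display\s*:\s*none|visibility\s*:\s*hidden|text-indent\s*:\s*-\d{3,}", re.I)

for tag in soup.find_all(style=hidden_style):
    text = tag.get_text(strip=True)
    if text:  # hidden elements that actually contain text are worth a second look
        print("Hidden text found:", text[:60])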

For all of the time that proper on-page work takes, the off-site optimization is just as intensive. There is no real handbook on how to perform off-site optimization properly, but there are a couple of key points you can keep an eye on. While you have no real control over who or what links to your website initially, you do have the ability to manage the backlinks pointing to your website via Google Webmaster Tools (a rough triage sketch follows the list below). There is thankfully some reprieve where your backlink profile is concerned, as the search engines are pretty good at picking out which websites aren't on the up and up. One of the things website owners seem to get most worked up about is having links from websites that are obviously selling links to anyone who will pay. Thankfully, the short story is that Google very likely already knows those sites aren't playing by the rules, and any links they point at your site won't be a problem for you. Matt Cutts, the head of Google's webspam team, gave a brief answer to this concern; the main points to keep in mind:

Websites that sell links will see their toolbar PageRank downgraded by 30, 40 or 50%.

The site will no longer be able to pass PageRank.

Sites they link to will no longer benefit from those links.
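
There is no official formula for spotting a bad link, but if you export your external links from Webmaster Tools you can at least triage them. A minimal sketch, assuming you have saved the export as a plain text file with one linking URL per line; the file name and the flag words are just examples, not any official criteria.

# Very rough triage of a backlink export: flag domains whose names
# contain words commonly associated with paid or spammy link networks.
# This is a heuristic starting point, not a verdict on any link.
from urllib.parse import urlparse
from collections import Counter

FLAG_WORDS = ("seo", "directory", "linkexchange", "bookmark")  # example terms only

domains = Counter()
with open("links_export.txt") as f:          # one backlink URL per line (assumed format)
    for line in f:
        host = urlparse(line.strip()).netloc.lower()
        if host:
            domains[host] += 1

for host, count in domains.most_common():
    flagged = any(word in host for word in FLAG_WORDS)
    marker = "REVIEW" if flagged or count > 50 else "ok"
    print(f"{marker:6} {count:5} links from {host}")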

Save Money On Google Penguin

In the last year Google has released some pretty heavy-duty algorithm updates to try and clean up their search results pages, affectionately referred to as the Penguin and Panda updates. It is usually fairly easy to tell when an update is rolling out, as the results pages shift noticeably over a few days, and during that time you can go from a top-3 position, to not found, to page 2, and so on.

There are people in the blogging world with some pretty close ties to the team at Google, and they were able to send off a quick email to find out whether the gut feeling about an update was correct or not. Just last week, however, it was announced that there would be no more direct confirmations from the search engine about updates or shifts that are occurring. Don't worry, it's not that communication is completely breaking down; it actually has more to do with the fact that the shifts shouldn't be as abrupt or drastic anymore. Perhaps it's a sign that the results pages are getting closer to what the team at Google deems acceptable?

Of all the things we dislike in the SEO business, there is always a short list of things we try our hardest to stay away from. Some can be avoided with client assistance, some are completely unavoidable, and regardless of how frustrating the experience can be, we have to deal with it. As with any endeavor, one of the biggest issues that can pull a project apart is having too many cooks in the kitchen. There are all sorts of clichés to this effect, but they all mean the same thing: put too many people on the same project, and information and procedure will get lost. We try to minimize the impact of this by working directly on a client's website, but there are cases where that isn't a possibility, and so we deal with it. To a web developer, two days to make a change may seem like an acceptable time frame, but to the internet and the search engines it can mean all the difference in the world.

Another issue that sometimes crops up, and that we sometimes get phone calls about, is wanting a quick solution to a long-running problem. Search engine optimization is not an "apply duct tape here" type of process; the minimum time frame we advise our clients of, both existing and prospective, is a 4–6 month window within which you might start to see consistent improvement. So the next time you're considering using SEO to help bolster your site positions, just remember that if anyone tells you "I can do it in two weeks for $300", at worst you are potentially digging yourself a very deep hole, and at best you have spent money for no real lasting results.

The Dark Future of Facebook Search

In the last few months Facebook has come out in the open about their own search offering, and if you are interested in trying it you can sign up for it. Graph Search is an intriguing idea, but as numerous blogs and articles on the web quickly discovered, the results it returns can be a little on the flaky side at times. You can even toy with the search interface and come up with some very unusual searches as an example.

The service is still in its infancy; it has a lot of learning and a lot of growing to do. One of the main complaints that has come up, however, and that has been consistent across articles, is that the web search feature provided by Graph Search is lacking. In fact, it's lacking enough that it may as well be non-existent, and there were writers out there who hoped the service would improve over time. It seems their prayers will go unanswered, for the time being at least, as Grady Burnett, VP of Global Marketing for Facebook, said in no uncertain terms that there will be no external search engine. The actual quote from SMX West:

GB: I don’t see that happening. We called it “Graph Search” because we’re focused on letting people search the Facebook graph. So my answer would be no.

There will be a handful of different responses to this message from the company; some will be cheering, others will be jeering of course, but those in the search industry who truly understand won't be surprised at all. When you consider all aspects of the internet, not just search and social, a picture begins to form. A visual map of the internet obviously isn't an exact replication of what the web looks like or how it's divided, but it makes it easier to understand, and to see why Zuckerberg, who built the largest social network in existence, isn't worried about external search at the moment. There is so much more out there than Google or Bing or Yahoo; they're only a fraction of what makes the web so massive.

Everlasting Search Battles

Search engines catch a lot of flak for various things; some of the items topping the list are privacy concerns, spam issues, and irrelevant search results. The laundry list of complaints and concerns that users come up with about the engines is as varied as the users themselves, and can sometimes border on the silly, like the recent court case Google won.

All things being equal, though, the complaints do have some merit, as there are lingering issues with search and the results pages. There are unscrupulous users out there who try to ruin it for the rest of us by gaming the search engines: cloaking pages, hiding links, scraping genuinely great content, and sometimes just running a content generator that spits out nonsense. At a recent event, Matt Cutts discussed some of the top issues that Google actively works to keep out of their search results. Some of the ones you would expect are on the list, such as auto-generated spam content, which usually reads like nonsense and is stuffed to the brim with keywords. Not surprisingly, keyword stuffing also made that short list of immediate no-nos; where SEO is concerned this is a basic of the basics, you just don't do it. And as you'd expect, pages full of nonsense text for no reason at all are on the list too. Unfortunately it isn't too difficult to find them: just run a search for "Lorem ipsum" and you'll find far too many poor businesses out there who have no idea they're being swindled.
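
You can run the same kind of check against your own pages before anyone else does. Here is a small sketch that looks for leftover placeholder text and crude keyword stuffing; the URLs and the 5% density threshold are arbitrary examples for illustration, not Google's numbers.

# Scan your own pages for leftover "Lorem ipsum" filler and for any single
# word that makes up an outsized share of the visible text.
import requests
from bs4 import BeautifulSoup
from collections import Counter

pages = ["http://www.example.com/", "http://www.example.com/services"]  # placeholders

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = soup.get_text(" ", strip=True).lower()

    if "lorem ipsum" in text:
        print(url, "still contains placeholder text")

    words = [w for w in text.split() if len(w) > 3]
    if words:
        word, count = Counter(words).most_common(1)[0]
        density = count / len(words)
        if density > 0.05:  # arbitrary threshold, purely for illustration
            print(f"{url}: '{word}' is {density:.0%} of the copy - possible stuffing")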

One of the methods that was discussed, and which is still an issue online, is the outright hacking of websites. It started when the web was born, and will likely always exist in some form or another. There are many different levels of hacking, and not all of them are malicious; web developers have been resorting to some of these methods to deal with Internet Explorer for years. But there are still malicious hackers out there who will attempt to seize your website and use it to their advantage. You can deter most attempts simply by following safe web practices, ensuring your website software is up to date, and taking the time to actually look at your own website from time to time. If something looks wrong, investigate and fix it before it becomes an issue.
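
Part of "actually looking at your own website" can be automated. Below is a minimal sketch that flags outbound links and script sources pointing at domains you don't recognize; the URL and the whitelist are made-up examples you would adjust to your own site, and a real audit would go much further.

# Flag external scripts and links pointing at domains you don't recognize.
# Injected spam links and rogue script tags are common signs of a hacked site.
from urllib.parse import urlparse
import requests
from bs4 import BeautifulSoup

url = "http://www.example.com/"                                      # placeholder
known = {"example.com", "www.example.com", "fonts.googleapis.com"}   # example whitelist

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

refs = [a.get("href", "") for a in soup.find_all("a")]
refs += [s.get("src", "") for s in soup.find_all("script")]

for ref in refs:
    host = urlparse(ref).netloc.lower()
    if host and host not in known:
        print("Unexpected external reference:", ref)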

Search engines and the internet are still very young technologies if you really break down the numbers; they have a lot of evolving to go through in their lifetime. There is clearly massive potential for growth in all sectors – search, websites, information sharing, and so on – and for as many innovations as we make, there will be users who try to take advantage. Be aware of your website and its content, and try to keep in touch with your website admin and your SEO provider; you'll be thankful you did.

Search Context Matters

A quick pop quiz for you: what do these terms all have in common – cheap, cost effective, reduced cost, low price, reduced price? If your first response is that they are all basically the same thing, then you're correct. Wouldn't you know it, however, that the search engines, and the internet, don't see things quite the same way?

Search engines like Bing, Google and Yahoo are great at the basics of figuring out what it is you're trying to find online. Using the terms above as an example, if you searched for a "cheap washing machine" you would expect to get ads for refurbished machines, maybe some Kijiji ads or even Craigslist offers. The problem with the way search engines determine what you're looking for really becomes apparent when you search for "low price washing machines" instead. The phrases mean the same thing to a person, but to a search engine bot they're completely separate values; you could just as well be searching for washing machines in one instance and a space shuttle the next.
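
To see why that matters in practice, here is a toy illustration, with a made-up synonym list, of how an exact-match lookup treats those phrases as strangers until you explicitly teach it otherwise. It is only a sketch of the idea, not how any search engine actually works.

# Toy example: exact token matching sees almost no overlap between these queries,
# even though a person reads them as the same intent.
SYNONYMS = {"cheap": "low-cost", "low": "low-cost", "price": "cost", "reduced": "low-cost"}  # made up

def tokens(query, expand=False):
    words = query.lower().split()
    return {SYNONYMS.get(w, w) for w in words} if expand else set(words)

a, b = "cheap washing machine", "low price washing machines"

print(tokens(a) & tokens(b))                            # {'washing'} - barely related
print(tokens(a, expand=True) & tokens(b, expand=True))  # overlap grows once synonyms are mapped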

The bearing this has on you, as a website owner and online storefront, is that you need to be clear in the message you present, and your website needs to support that message. If you are in the business of repairing and reselling washing machines, then you need to be clear that yours are both cheap and low price. Search engines, as amazing as they are at what they do, have little sense of context, and as a result you need to relay that information to them. You do this both with your content and with your optimization efforts. When you're ready to finally be known for all of your business services, the online branding experts are here to help.

Changing The Search Game

With the way that search is always changing and evolving, you would have to think that someone, somewhere is going to hit on the perfect search machine: one that combines social signals with personal preferences, throws in some local results and, to top it all off, is completely unbiased. While that is unlikely to happen in the near future, it's not impossible.

The first thing that needs to happen for that to become a reality, however, is that the entire web needs to be free to be indexed. That means forums, social sites, business pages, and any other site which holds information or a service for web use. So the first step would be everyone playing nice and getting along, instead of locking away portions of websites from the information-gathering devices, whether they be spiders or some new type of bot. Only after such a bot has found its way around the web and built up a new version of a searchable index can context be used to create a search process. The real problem with this step is the creation of a new type of page index. There are a half dozen different search services out there, and everyone seems to have their own way of doing things. It isn't any real secret that Google is currently the king of the castle, but despite their prominence, the potential for them to lose their spot still exists.

Assuming that a new type of index does exist, and a new search technology exists to take advantage of it, this is the point where personal preferences take over. This is the point in search where everyone is unique, and as different as we all are, we still expect a familiar experience. As an example, with the way the web currently works, if you perform the same search at home and then on a computer that isn't yours, you will get similar, although different, results. That disparity is what will likely be the game changer for search: when you can receive consistent results regardless of device, the next search king will be crowned. The solution is likely cloud based, with your preferences stored virtually rather than locally, and with the browser not being a program you install on your computer but one you simply access. The closest a company has gotten so far to delivering a product in this way is Google, at present, with their Chromebook product. While it's not terribly surprising that they're the first to venture into a wholly cloud-based product, it would be exciting to see others making the same steps.

Manual Penalties Can Be Recovered From

A little while back, a major site, Interflora, was effectively kicked from the search engines for breaking the rules and passing PageRank via paid advertorials. That was a couple of weeks ago, and they were completely removed from the results pages; now it seems that they're back in position. When so many are crying foul and claiming to be wronged by the search engines, is it really just that easy to bounce back?

Google confronted the company over links from the paid advertorials, as well as linking setups that had been labeled as toxic or suspicious. The number was so high and so evident – some sources saying as much as 70% of the links were toxic – that the manual penalty team couldn't miss it. Since the company had fallen from grace with Google, they had a long row to hoe and a relatively short time to do it, as one of the biggest flower-giving days the world over, Mother's Day, was coming up. Given how they were handled within Google, how they went about dealing with their mistakes, and how they leveraged the tools available within Webmaster Tools, their rebound begins to make more and more sense.

Ironically, one of the reasons they were able to get back into position quickly was that Google was already poring over their site and their links, since they had clearly been naughty previously. They began the painstaking process of cleaning out their poor backlinks and disavowing the rest using the disavow tool in their Webmaster Tools account. It's a long and arduous process, but by putting lots of people on the job they cleaned out their entire linking profile and stopped handing out PageRank to various places on the web. There has been some speculation that properly recrawling every single page and link previously tied to the company should have taken months, and with the disavow tool still relatively new and its inner workings uncertain, a couple of theories have cropped up regarding their speedy return. One is that Google manually took care of the process, which is possible seeing as they were well aware of what the company was doing. The other possibility is that the penalty levelled against the site was lifted after only a small number of the links were crawled – a kind of forgiveness nod for cleaning up a mess you created.
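
For reference, the disavow tool itself just takes a plain text file of URLs and "domain:" entries, with "#" lines as comments. Here is a small sketch that builds one from a list of links you have already decided are beyond saving; the domain names and URL are invented purely for the example.

# Build a disavow file in the format Google's disavow tool expects:
# one URL or "domain:" entry per line, "#" lines as comments.
bad_domains = ["spammy-links.example", "paid-directory.example"]   # invented examples
bad_urls = ["http://blog.example/old-advertorial-post"]            # invented example

with open("disavow.txt", "w") as f:
    f.write("# Links we contacted site owners about and could not get removed\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")
    for url in bad_urls:
        f.write(f"{url}\n")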

Where we are in the end is that the company is back in the search pages, its linking profile, while not completely fixed, is noticeably better, and there are more questions unanswered than answered. It's likely that, due to their size as a business, Google kept a very close eye on them and is being quite lenient with any remaining links they may have. The sullen side of the web, though, is crying foul and chanting that the larger sites get special consideration while the little guy wallows in the depths of the web. Only Google really knows what happened in the end, but regardless of why their return was so quick, it was a great litmus test of the disavow tool, manual reconsideration and search reinclusion requests.

Bing Needs to Shape Up

When you're doing any kind of advertising and marketing, you eventually need to work out the numbers and decide whether or not it has been a worthwhile investment. For Microsoft, that time frame was a scant four months, during which they spent who knows how much money on their largely failed "Scroogled" advertising campaign.

It wasn't pushed terribly hard across all advertising channels, but occasionally you would catch one of the ads, whether in print, on television or online. The general premise was "Google isn't playing by the rules, so come and use Bing!". Completely unsurprisingly, the internet didn't really notice that Microsoft was stomping its feet and throwing a tantrum, except to maybe pat them on the head occasionally and have a chuckle at some of the videos they made. A link to my absolute favorite of the handful I saw:

I'm guessing, by the way they scripted the ad, that Bing would have told you that if you use a pan on too high a heat you would start a fire? It's also somewhat cringeworthy that the way Bing decided to distribute their videos was to upload them to YouTube, a wholly owned Google web property.

With the Scroogled campaign, Microsoft was aiming to make it appear as if Google was infringing on every possible piece of private information, and while Google did start serving ads in Gmail over the past year, most Gmail users report that the ads aren't an issue for them. Meanwhile, Microsoft's own webmail service, relaunched from Live Mail as Outlook.com, even ran the same type of ad service once you signed in. At least someone in the Bing world finally decided to actually watch one of the ads or read some of the print they put out – the content was silly at best and an insult to general intelligence at worst – and cancelled the entire campaign. Who knows what's next on the Bing advertising plate for taking a crack at Google's share of the web; they've tried positive advertising and negative attack ads. Maybe some day someone will actually decide to take a look at their search tech and make some upgrades there – here's hoping!

What Drives You

While Google is undoubtedly the largest search engine on the web, with its trillion-plus pages indexed, it is not the only tool out there with which to make your way around the web. But while there are hundreds of millions of web users out there, there is only a handful of search engines that garner any real use.

Google, as mentioned previously, clearly holds the dominant spot online and has for a number of years. With more than two thirds of the market share in search, it has a massive presence on the web. With the clout they have in the worldwide market, any business that has a website is keen to try and make a place for itself on the front page. The bigger the target, the more detractors one is bound to have, and Google is definitely the biggest target. Between privacy issues, a social platform that (at first) floundered and has grown somewhat stale, and a long list of competitors claiming anti-competitive behaviour, it seems amazing that they could still be in business, but while they haven't made friends with every user on the web, 66% of the market is more than enough.

The second most widely used search engine is really two, as one index serves both: Bing and Yahoo together gobble up the majority of the remaining search activity. The Yahoo results pages have, for more than a year now, been provided by the Bing search bot, rather than Yahoo running their own bot and building their own index of the web. And while this still gives adopters of the Yahoo portal a way to browse the web, they're not being delivered Yahoo's own true results. The new CEO at Yahoo, however, seeks to change all of that; hopefully 2013 brings some shaking up in the search world. Bing as a search service has been trying hard for a couple of years to break into the market that Google dominates. With some clever ideas in image search, flyout snippets of search pages, and sometimes widely differing results from Google, Bing still has a share of the market that hasn't shifted much in a number of years. Perhaps they can rekindle their search agreement with Facebook and together develop a full-fledged social search service; only time will tell.

In the last little corner of the search world, you have some of the little guys who are trying to shake up the web. Blekko, one of the more interesting search services out there, bills itself as spam free and is a great way to pick your way through a results page. Your experience will vary wildly based on what, and how, you search, but their use of what they describe as slashtags allows you to greatly fine-tune your search parameters. It's an interesting technology and definitely gives a different view of the web and its offerings. Another small fry in the search landscape, but one which caters to those concerned with privacy, is DuckDuckGo. It has the same clean search UI as the others, with a basic text input box, but it delivers results from outside the "search bubble" they describe other search engines as putting you in. It is a great way to have a look at what the web might look like with no search history to go on; the results can be interesting to say the least.

Facebook's Graph Search Fail

It hasn't been news for a while now, but the Facebook Graph Search feature that is being tried and tested is slowly making its way to a live feature available to all. The massive social sharing site, with more than 900 million members, has an unimaginably large data set to pull answers from, and it allows you to search the interests, location, and preferences of your friends list. In its current state, it is the tail end of that statement which holds the most important piece of information – the preferences and interests of your friends.

The implementation of Graph Search is not a bad idea on paper, or even in practice, but it does have a long way to go where you're really searching for an answer. The best way to describe the service and what it offers was summed up here:

For anyone who uses the Internet to search restaurant recommendations, travel advice, books to read on vacation, or which political candidates to vote for, Facebook may have replaced Google as the best search engine.

The veracity of the end of that statement is questionable at best, as Facebook Graph isn't so much a search engine as a report of your friends' opinions. The bonus is that you can compile the information quickly, in an easy-to-digest fashion that you can use to reach a decision about whatever you searched for.

The downsides, however, have slowly been coming to light as more people are allowed to use the service. For example, really searching for a person or topic doesn't happen with Facebook Graph at the moment; on the surface it seems that Facebook is using its algorithm to scrape statuses, updates and likes. The downside to that is that if you haven't liked a page, commented on it or had a status update with the term in it, it is highly likely you won't show up for some of your own interests within their search. I've not had the chance to try the service myself, as it is in beta testing in the US only at the moment, but going by other sources, it seems they have other issues as well. The image search doesn't work as well as it potentially could, due to most images not having a geotag associated with them. And the Facebook version of instant search goes a bit over the top by putting in elements of autocomplete as well, trying to anticipate what you're looking for.

Facebook has an immense amount of data and power at its fingertips with its user base, but it isn't a surprise to see them stumbling along in an area they are not suited for: search. It may be a strange thing to say, but I hope they improve and I hope they find a way to truly integrate the web into their service. Google is an incredibly powerful tool, and everyone does that much better when there is some real competition. Here's hoping Facebook doesn't drop the ball with Graph Search, and with the overall improvement of the web.