When you’re online, you can always find a survey, a poll, or a questionnaire about almost anything. Recently, the results of one such survey were announced: the website SEOmoz.org conducted what it called an ‘SEO Industry Survey’, and most of the results are... predictable. It’s a lengthy read, but you can find the full report here.
First off, they collected the usual data about their respondents: who are you, where are you, how long have you been working online, and so on. Per their results, the US accounts for the largest share of respondents, and there are slightly more female respondents than in 2012, up from 20.6% to 22.7%. Half of us work as in-house marketers and are 26-34 years old; clearly still a young industry at heart.
When it comes to the jobs we do, it seems the majority of us feel we have more than one, though search engine optimization is the number one role we provide (92% response). It’s followed by analytics assistance (82%), with link building and content marketing in the same boat (71%), and coming up fast, likely thanks to Facebook, is social media management, which 70% of respondents say is a role they take on.
Not that it should be any surprise, but almost all of us (93%) learn our trade online via websites, blogs, and forums, while nearly as many (88%) say the hands-on approach works best. After a sharp drop-off to 64% of us reading an actual book, there’s another dip: only half of us attend conferences or workshops to learn the tricks of the trade. When it comes to plying that trade, almost every search engine will tell you that content is king, so when we’re creating, sculpting, or working on a site, the following chart gives a solid representation of where we devote our time.
An interesting patent has been discussed lately, which may be related to all of the recent shifts in Google’s index that seem to shake up the rankings every few weeks.
A basic description of what the patent covers:
Rather than allow the rankings to respond immediately and directly to those changes, the system would change rankings in unexpected, counter-intuitive ways while the rankings move from a first position, through transition positions, to the final “target rank” position.
It sounds a little strange to think that the algorithm would work in an almost backwards way, but it’s an interesting idea. It’s a way for Google to monitor situations where spam may be affecting a website’s position, whether through links or content. And as the last portion of the analysis describes,
significant changes in position continue to happen even though there is no change in the page’s ranking factors
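The idea can be sketched with a toy simulation (purely illustrative, not Google’s actual code): instead of a page jumping straight to its new position, its rank drifts through jittered intermediate positions toward a target over several updates, which makes it hard for a spammer to tell whether a tweak actually worked.

```python
import random

def transition_ranks(old_rank, target_rank, steps, seed=0):
    """Return a sequence of intermediate ranks between old_rank and
    target_rank, with random jitter so the path looks counter-intuitive
    to anyone watching the results. Toy sketch only."""
    rng = random.Random(seed)  # seeded for repeatability
    ranks = []
    for step in range(1, steps + 1):
        # linear drift toward the target rank
        base = old_rank + (target_rank - old_rank) * step / steps
        # random wobble on every update except the final one
        jitter = rng.choice([-2, -1, 0, 1, 2]) if step < steps else 0
        ranks.append(max(1, round(base + jitter)))
    return ranks

# A page moving from position 50 to position 5 over six updates:
print(transition_ranks(50, 5, 6))
```

The last value always lands on the target rank; everything before it wobbles, which is exactly the “unexpected, counter-intuitive” movement the patent describes.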
Happy Birthday to Fresh
It’s been five eventful years now since Fresh made its landing in Winnipeg, and we’ve steadily become the online leader in the city. We’ve seen the construction of a new museum and stadium, the return of the long-lost Jets, and a steadily growing business sector.
In terms of online change and growth, the internet has brought us nearly a billion Facebook users, an alternative in Google+, and many changes to the way people access information on the web. Online search, and the number of businesses using their websites to capture their audience, have grown steadily in Winnipeg. While it was a little slow going in the beginning to show locals the way, we’re surely and steadily getting there, with more businesses and people coming online each day. It’s been a great beginning, Winnipeg, and we look forward to many more years to come.
If you’ve felt a little overrun lately by the Google zoo that has been running through the index, we’re sorry to report that we’re not quite out of the wild yet.
Google has said that while the Penguin algorithm shift is targeted at removing or reducing spam sites in the results, it is also still being actively adjusted and worked on. Penguin acts as an ‘adjustment’ (their word) to the results, removing the backlink value that spammy sites could pass on to those looking to make a quick buck or a quick shift in their positioning. As for Panda (that was the content update, in case you’ve forgotten), it is still tuned for digging out poor content on sites and pages. And although the updates are coming more consistently now that they’re automated, the shock and surprise that website owners initially experienced from positioning drops have lessened.
Panda is now a regular monthly addition to the search index, and Penguin is being incorporated in much the same way, though it still has a ways to go. With a feature soon coming to your Webmaster Tools for handling unnatural links pointing to your website, there will undoubtedly still be some site owners who see a shift in rankings if they’re inattentive to their site. Over time, however, it will become part of regular indexing, and the results will be cleaner for it.
Matt Cutts on the growth still needed for Penguin:
If you remember, in the early days of Panda, it took several months for us to iterate on the algorithm, and the Panda impact tended to be somewhat larger (e.g. the April 2011 update incorporated new signals like sites that users block). Later on, the Panda updates had less impact over time as we stabilized the signals/algorithm and Panda moved closer to near-monthly updates. Likewise, we’re still in the early stages of Penguin where the engineers are incorporating new signals and iterating to improve the algorithm. Because of that, expect that the next few Penguin updates will take longer, incorporate additional signals, and as a result will have more noticeable impact.
Late last week, Google announced an additional metric for how it will handle search results. Starting last Friday, Google began taking valid DMCA requests into consideration when ranking the index. Ahead of the new portion of the algorithm going live, they had this to say:
Sites with high numbers of removal notices may appear lower in our results. This ranking change should help users find legitimate, quality sources of content more easily – whether it’s a song previewed on NPR’s music website, a TV show on Hulu or new music streamed from Spotify.
There are a couple of Google-owned properties that are notorious for hosting copyrighted content, specifically YouTube and Blogger. And while they tend to receive the lion’s share of DMCA requests, Google has said it’s the valid takedown requests that will be used as the metric to decide who should stay and who should fall. It’s the next major algorithm shift in store for site owners, and it will be interesting to see where it takes the content of the web.
Google is taking another page from Facebook’s social networking playbook and will be allowing vanity URLs for some select profiles. Currently, most Google+ URLs end in a long string of numbers denoting your profile, though some are now being tidied up.
While the idea is to roll the feature and its vanity value out to all users, so far the cleaned-up addresses have only been given to a few on the social network. It’s a step in the right direction for Google’s social offering, but they still have a fair amount of ground to cover to catch up to Facebook. One small problem has already been spotted with the change: the new vanity URLs weren’t rolled out with optimization in mind. The new addresses are being used as canonical URLs rather than as full 301 redirects that would pass the full and proper value on to the search engines.
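The distinction matters for SEO: a 301 tells browsers and crawlers that one address permanently replaces another, while a rel=canonical tag only hints at which URL should be indexed. A minimal sketch of the two mechanisms (the profile URL here is made up):

```python
def redirect_301(location):
    """A true 301 redirect: the server sends browsers and crawlers
    straight to the target URL, consolidating the page's value there."""
    return ("HTTP/1.1 301 Moved Permanently\r\n"
            f"Location: {location}\r\n\r\n")

def canonical_hint(canonical_url):
    """A rel=canonical tag: the page still serves at its current URL;
    the tag merely suggests which address search engines should index."""
    return f'<link rel="canonical" href="{canonical_url}">'

vanity = "https://plus.google.com/+ExampleUser"  # hypothetical profile
print(redirect_301(vanity))
print(canonical_hint(vanity))
```

A canonical hint leaves both addresses live and asks the engines to sort it out; a 301 leaves no ambiguity, which is why it is the preferred way to move a page.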
Google has been king of the search world almost from the day it became a tool on the web. There are a handful of other search engines as well, all of which do their best to offer a choice when you’re looking for information online.
There’s been some discontent with Google as a service, and it sometimes leaves users craving an alternative to the giant. There’s a small problem with that idea, however, and it’s the same thing that makes Google so successful. Consider the basics of search: if you have quality content that people readily link to, you’re going to be well represented on any search engine, whether Google, Bing, or otherwise. Google has simply worked out how to best deliver the content you’re most likely searching for, because it can evaluate both the content and the links leading back to it.
That doesn’t mean, though, that the web and search are doomed to stagnation. Everyone is working to innovate in the space, trying to find the newest, biggest evolution in search and online interaction. Some do their best to be an answer engine, where you essentially query a database for an answer, while others pride themselves on being hand-curated by teams of human editors to promote the most relevant content. Bing and Google are both trying their hands at integrating your social life into your search results, with mixed success at present. But all of these search engines stumble on the same issue: today’s relevance is built primarily upon links and link structures to give value and authority to a website.
The future of search won’t lie in constructing links back to your quality content; it will arrive when someone builds a search engine that can predict what you may be searching for. When you start looking for a new home for sale in a new city, for example, it will determine from your current and previous searches that you need a home near a school for your children, and deliver those results as the most relevant. The technology doesn’t quite exist in that form yet, as it would require massive amounts of computation to hold the web open, ready to pick out the points you’re searching for. But the web and its technology grow every day, and perhaps soon enough we’ll simply talk to our devices to find what we want.
Around the middle of last year, Google started slowly pushing out warnings to webmasters about what it deemed ‘unnatural links’ pointing to their websites. Unnatural links, for lack of a better description, are links unrelated to your website: a plumbing forum linking to a website about cooking or gardening, for example. Earlier this year, Google stepped up the notifications significantly and almost immediately sent the world of search engine optimization into a tizzy.
It was at that juncture that webmasters began to drop links to and from their websites, presumably in the hope that sending in a reconsideration request would clear the mark from their Webmaster Tools page. It’s an interesting process Google has put in place with the unnatural-links notifications: some webmasters have presented evidence that they did nothing and still plummeted in the search results, while others, who went through untold rigmarole trying to get their links cleaned up, reported no change in their positioning despite multiple notifications.
And to muddy the waters a little more, over the last day or so Google has sent out another massive batch of unnatural-link notifications to webmasters everywhere. It seems that lately, with all the features Google has been adding to its Webmaster Tools suite, they’re really looking to place responsibility on site owners. It’s an interesting twist when you consider that search engines base at least some portion of their ranking factors on the links pointing to a website. Maybe this is the beginning of Google trying to diversify its ranking algorithm and ideals? Time will tell, but giving webmasters the idea that they need to carefully maintain their link profiles is an interesting step.
Just as Hitwise measures search market share, there is a report put out by ACSI (the American Customer Satisfaction Index) that tries to put a number on how happy users are with the various search engines and social media sites out there. While the survey held some expected results, there was a surprise or two as well.
As far as search engines were concerned, it wasn’t a huge surprise to see Google still at the top of the list with an overall 82 points out of 100, with Bing picking up a little ground and coming in at 81 points. When pressed for reasons, more than half of the respondents who chose Bing said they liked its ease of use. I may be somewhat biased, as I’ve always primarily used Google for the bulk of my searching, and perhaps it’s a difference between Bing.ca and Bing.com, but I’m not sure how Bing is easier to use than Google when both are just a search box. The links that appear after performing a search are nearly identical, and it’s a short affair to refine your results and tailor them as you like. But opinions differ for everyone, and that is the main point of a survey, after all: to gather as many of them as possible.

Back to the list: plonking our way down, we pass Ask.com at 80 points and Yahoo at 78 points in customer satisfaction. Note that the survey wasn’t about who uses which search engine; Hitwise covers that quite well, and those numbers are fairly static, with Google holding the lion’s share of the market. The point of the survey was satisfaction with one’s preferred search engine, so to acquire a rounded opinion the survey would, after a point, have filled its quota of Google users and gone looking for Bing, Yahoo, and Ask users.
A new report from ACSI, however, details the satisfaction of those who use social media sites like Facebook and Google+. Just like the search engine report, it’s not about market share but satisfaction, so the same principle applies: to form a rounded opinion you need as equal a number of respondents as possible for each social media site. It’s in this report that the numbers start to surprise. Top marks actually went to Google+, with 78 points out of 100, followed by YouTube (73 points) and Pinterest (69 points). Twitter, LinkedIn, and Facebook all took the bottom spots, with Facebook in the basement at 61 points.

With such a vastly diverse user base, it’s understandable that some users would hold strong opinions about how Facebook handles itself, but a few key reasons emerged that hurt the social media giant. The biggest issues came from the implementation of the Timeline feature; users felt there were too many frequent, unnecessary changes to the user interface. Intrusive advertising was an issue for nearly 20% of respondents, and one of the largest contributors to dissatisfaction was the privacy concern that still dogs the social giant: nearly half of those surveyed rated Facebook a 5 or lower on a 10-point scale for how it handles privacy.

Not surprisingly, the reasons Google+ excelled in the survey happened to be the reasons Facebook tanked in comparison. On that same 10-point scale, 60% of Google+ respondents ranked the fledgling social site’s privacy protection as excellent. No advertising, at least not in the sense that dominates Facebook, exists on the service, with no present plans to add any, and a very strong mobile presence also helped Google+ attain the top satisfaction marks this year. There is, however, a small caveat to bear in mind with the social media results.
On the whole, taking all of the social sites together, users rate their satisfaction with social media at only 69 points out of 100, putting it near the basement of the study alongside television, newspapers, and airlines.
When discussing links and linking strategies for your website, I’ve mentioned the negative consequences of having poor or unrelated backlinks pointing to it. The watered-down version: say you own a window repair business, and in passing the search engines notice you have a few hundred links from a taxidermy site pointing to your URL. That’s a very quick way to get yourself in trouble, have your site scrutinized, and quite possibly be dropped from the index until you’ve gone over all of the backlinks pointing to your site.
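To make the idea concrete, here’s a toy sketch (not how any search engine actually works) of flagging referring domains whose topics share no keywords with your own site’s; the domains and keyword sets are invented for illustration:

```python
SITE_KEYWORDS = {"window", "repair", "glass"}  # hypothetical window-repair site

def flag_unrelated(backlinks):
    """backlinks: list of (source_domain, source_keywords) pairs.
    Return the domains whose keywords have zero overlap with the site's,
    i.e. the links that look 'unnatural' in this toy model."""
    return [domain for domain, keywords in backlinks
            if not SITE_KEYWORDS & set(keywords)]

links = [
    ("glassguide.example", {"window", "glazing"}),      # related, fine
    ("taxidermyforum.example", {"taxidermy", "mounts"}),  # unrelated, flagged
]
print(flag_unrelated(links))  # prints ['taxidermyforum.example']
```

Real engines weigh far more signals than topical overlap, but the principle is the same: a backlink profile full of off-topic sources stands out.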
It has long been an issue for the search engines to deal with the proliferation of improper linking schemes employed by sketchy SEO practitioners. Google has its list and documentation covering what it will do to your site should improper links or linking strategies point at it, but has it been beaten to the punch in truly dealing with the problem? Very recently, one of the other players in search, Bing, released a way for users to disavow links pointing to a website. What leads to a bit of confusion, however, is the way Bing talks about how improper backlinks affect your site, sometimes saying they won’t do any harm and sometimes saying they could very much hurt your position in its rankings.
From the Bing Webmaster Blog:
Today we’re announcing the Disavow Links feature in Bing Webmaster Tools. Use the Disavow Links tool to submit page, directory, or domain URLs that may contain links to your site that seem “unnatural” or appear to be from spam or low quality sites. There is no limit on the number of links you can disavow via this tool.
It’s a great way to gain more control over who is pointing their content at your site, as well as over your own web positioning. It’s a solid first step toward controlling your backlinks; it will be interesting to see how Bing deals with the reports submitted, as they have been notoriously slow to handle changes and updates to their index.
At present, there are a couple of different penalty systems active at Google: the Penguin update, and the link penalties being levied against sites with strange link profiles. If you’ve noticed your site’s position slipping, how do you know which way to proceed? Let’s have a quick look and see if it can be narrowed down.
First, the Penguin update, which is still running around, seemingly causing havoc for some site owners. The most recent version of Penguin in the wild looks for unnatural backlinks, applies a tighter field of view when looking for fresh content, generates better page titles (if your page doesn’t have one), and improves detection of hacked websites and pages. Note as well that all of these algorithm changes, and many more, run automatically as the spiders break down your site. If you’ve received no notification in your Webmaster Tools area, chances are you’ve been hit by a Penguin drive-by.
The second most common way sites are currently being hit is with what’s called an unnatural link penalty. The key difference from Penguin is that the link penalty is handed down manually against your website. If you pay heed to your Webmaster Tools notifications (you did remember to set those up, right?), then just follow the steps in place to correct the penalties levied.
Or as I saw it put so succinctly:
Unnatural links is more about link networks, paid links, blog networks and unnatural link patterns.
Penguin is more about low quality links with weird looking anchor text, plus other over optimization related link building techniques.
In the discussion that followed, it has been noted that the two seem to be interrelated, so be sure to keep tabs on your website’s performance, and don’t ignore the notifications you receive. You’ll only sink yourself faster by ignoring any warnings or messages.
Search engine optimization: it’s been the big marketing buzzword of the last five years or so. What was once a highly technical and relatively unknown business tactic has become a medium embraced by the masses. So well embraced, in fact, that it’s increasingly populated by people who barely understand what the term means, let alone how to properly implement it on a client’s site.
It’s becoming ever more obvious when we speak with prospective clients that their first introduction to the world of SEO wasn’t all it was cracked up to be. The most common way to be taken in is through your web host, offering what seems to be an amazing suite of submissions to directories and search engines. What these small, and sometimes large, companies don’t seem to realize is that directories don’t carry much influence with the search engines, and as for being ‘submitted’ to the engines, that’s not a process that exists in so many words. This scenario, bleak as it may seem, is unfortunately the best case.
The worst case, and we’ve run into it a few times, is a client who’s been drawn in by false promises and the ‘darker’ side of SEO. The black-hat entrepreneurs, if you can call them that, lure in clients with promises of page 1, top-10 positions and a quick return on their investment. The problem is that the tactics employed often destroy the company’s online reputation and frequently lead to the website being removed from the index. When we’re engaged by a client in such a predicament, we’ve actually had them start over entirely: new URL, new website, the works.
Key points to remember about legitimate search engine optimization:
- It is not a one-shot deal that will place you in the top rankings
- It is a long-term, high-ROI solution
- It is the highest-ROI marketing solution when the costs and gains are weighed
- A real SEO expert will engage you as a client to help you create content and build an online experience that consistently brings qualified visitors to your site
Don’t be fooled by get-rich-quick schemes where SEO is concerned: they don’t work, they don’t exist as advertised, and anyone trying to sell you one should be blacklisted in your contact book.