Tagged with "google"
Online marketing and branding can be an intensely competitive field, made even more difficult by the billions of web pages out there covering everything you can imagine. And while they say imitation is the sincerest form of flattery, it can be a death knell where the search engines are concerned.
With the web being so massive, it’s often difficult to say where content originated. Images get copied, text gets scraped, and snippets of code get replicated across untold numbers of websites. Where organic optimization is concerned, proving original authorship can be a time-intensive process, and even then it may not make much difference. Paid advertisements, however, such as AdWords campaigns, are a different story.
AdWords is a much different platform from organic search, the biggest difference being that you’re paying for your positioning in the results pages. You bid on your chosen keywords, and if your ad copy and your bid are better than your competitors’, your ad will appear, frequently above theirs. It’s a lucrative search market largely because it’s where people make their snap buying decisions. Some companies out there play a little dirtier than others, copying ad copy directly, or even copying ad titles and formatting. It’s a dirty business practice, comparable to Pepsi mimicking a Coca-Cola commercial or jingle.
As dirty as it is to copy a competitor’s titles, copy or entire ad text, due to the nature of the business they may be allowed to run the ad, unless of course you dispute their usage. A prevalent argument in these cases falls under the AdWords informational site policy, a long-winded document that covers the use of trademarked terms in AdWords. It essentially limits the use of a trademarked term to the original mark holder or a reseller of the product. A loophole exists, however, for informational sites, which may carry the trademarked text if the landing page of the ad is informational in nature relative to the written ad text. Just because the loophole exists doesn’t mean you’re out of luck if a competitor runs an identical ad using your text; your first step should be to file a dispute against the ad in your AdWords account. You’re also covered by the same trademark policy text, which essentially says you can’t use a trademarked term if the goal is to take sales away from the trademark holder.
Be diligent with your AdWords copy, and if you see someone using your own text to try to snag away sales, report them as soon as possible. If you let it slide, you could be losing your next big sale.
Recently Google turned on a tool which enables website owners to disavow selected backlinks coming to their site. It’s a great tool that allows a diligent site owner to take control over who links to them. The process is fairly basic: you create a text file listing the backlinks you’d like to have disavowed, upload it to your webmaster tools account, and voila, supposedly case closed.
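The text file itself is just plain text: one entry per line, a `domain:` prefix to disavow everything from a domain, and `#` lines for comments. A minimal sketch of assembling one follows; the domains and URLs here are hypothetical placeholders, not real spam sources.

```python
# Sketch: assemble a disavow file in the plain-text format the tool accepts.
# All domains and URLs below are hypothetical placeholders.
spammy_domains = ["link-farm.example", "spam-directory.example"]
spammy_urls = ["http://blog.example/paid-links.html"]

lines = ["# Disavowed after outreach to the site owners failed"]
lines += [f"domain:{d}" for d in spammy_domains]  # drop every link from these domains
lines += spammy_urls                              # drop only these specific pages

disavow_file = "\n".join(lines) + "\n"

# Write the file you would then upload through your webmaster tools account.
with open("disavow.txt", "w") as f:
    f.write(disavow_file)
```

One file per site; re-uploading replaces the previous list rather than adding to it, so keep the master copy on your side.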
It seems, however, that some people aren’t content with the way the system works. One webmaster, after submitting his disavow list and resubmitting a reconsideration request, was greeted with the advisory that there were still some bad links pointing to his site. The timeline he is unhappy with: it has been a month since the initial submission of his disavowed links. There are a couple of theories about why there are still problems, but there are also a handful of points that all webmasters who use the disavow link tool need to bear in mind.
The primary point to remember when using the disavow link tool is that it is not an instant or quick fix for any and all backlinks you might want removed. Google has data centers all over the globe, and with that it holds a number of different versions of your website at any given time. As odd as it may sound, it’s like the version control you’d use when working on a project through various stages of completion, so when you’re finished you can see what your steps were all the way through. Just as you could look at version 2 of your project and know where you were at that stage, each data center will have a slightly different version of your site and its backlinks. It takes time for any kind of clean-up request to propagate through the entire system.
The second major point is that just because you’ve submitted the disavow list, and/or asked for the offending backlinks to be removed, it doesn’t mean it’ll happen quickly (as per the first point) or at all. The tool functions much like asking another site owner to remove a link to your site: it’s a request, and if it is honored, you have no control over how quickly that happens.
So the election is finally finished, and the winner has been decided. If for some reason you’ve been living in a cave the last couple of days, Obama took the crown and is set to begin his second term as President of the United States. And regardless of who you were rooting for, there were some interesting discoveries over the last couple of months of the battle which have their roots in search.
A few days back, the Wall Street Journal ran a story about how Google was serving up results pages in what some thought was a strange coincidence. It seemed that even when signed out of a Google account and using a cookie-free browser, the results when searching for Obama almost became personalized. The article even went on to say that the search engine was biased when searching for Obama and related news, with one story coming right out and saying that the candidates were being treated unfairly. While it would make for a great conspiracy story, the unexciting truth is that this is just how the Google algorithm works. Google simply displayed results based on how people searched for terms, the example being
more people searched for “Obama” followed by searches for “Iran” than the number of people who searched for “Romney” followed by “Iran.”
That was the first interesting point, the second follows in a similar vein.
It’s not really news anymore that the candidates spent hundreds of millions of dollars on campaigning between them, but it was interesting to find that Obama outbid Romney on search ads online by nearly three to one. Both were bidding on big hitters like “2012 election” and “2012 presidential polls” to lead people to their campaign websites, but it was the incumbent President who owned the paid advertisements on the results pages. Sticking with the trend of online visibility, Obama had Romney beat across the board, with more Facebook fans, website visitors and YouTube video views.
The largest demographic in the voting populace is shifting to a much younger, information-hungry crowd, so being findable online should be an integral cog in any party’s agenda. When you shake out all the numbers, from organic results to paid search, it looks like in the end Obama simply out-optimized his opponent, and it helped secure him a second term.
There have been a number of changes in the search world over the past 15 years since its pseudo-birth, but the changes of the last 12 months have been some of the largest ever. There have been the Panda updates, the Penguin changes, and the EMD (exact match domain) changes that have made search engine optimization a much more interesting job. And not that they’re the only search engine in the game, but leave it to Google to make the most news with any change, seeing as they own the vast majority of the market.
I’ve outlined before what can happen when you make a mistake and breach one of the rules set forth by the engines. You can take a rankings hit, you could suffer a penalty in the form of losing some (Google) PageRank, or you could even be completely removed from the index if you’ve accumulated enough ‘strikes’ against your website or URL. As search engine optimization experts, it is our job to ensure that neither we nor our clients fall into any of the multitude of pitfalls out there. None of these scenarios is unrecoverable, although getting back into the good graces of the search engines will take time and an extensive SEO skillset.
However, if you don’t have the time, any search engine optimization skills under your belt, or the budget to bring in the real search experts, there is a solution for your business. It is one which will still take time, but you don’t have to worry so much about SEO skills initially, because you’re going to start down the road of rebranding. Completely rebuilding your brand image is really a last-resort option, as it can take almost a year to return to the search results pages. If you’ve found yourself far enough up the creek that rebranding is a more viable option than repairing the mistakes you’ve made, perhaps it’s time for an evaluation of your job description.
And finally, after being patient for the last few months, site owners with a Google webmaster account have the final say over how links to their site are treated. From the Google webmaster blog:
Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manual spam action based on “unnatural links” pointing to your site, this tool can help you address the issue.
This is going to be a great tool to add to your toolkit if you use webmaster tools directly, and if you don’t, you should check that your site manager is keen on what the tool can actually do for you. A quick rundown of how links to your site affect you: you create content, and if it’s unique content relevant to your niche, users will link to that page. These links are used as a factor when determining relevance in the results pages for the terms you may wish to rank for, and if you’ve created great content, the links will follow. More links are used as a measure of relevance, so the more the better. There is a downside to links, however, and that happens when you have too many ‘unnatural’ links pointing to your site; for example, links from a plumbing site pointing back to your website on shoe sales, two topics unrelated to each other. The only recourse you had as a site owner in this instance was to contact the website that posted the link and ask to have it removed; it was then out of your hands and left for them to deal with, and until it was, you could be handed a stiff penalty from the search engines.
The problem with that scenario is that after you’ve asked the site owner to remove the link, you no longer have control over what happens next. But with the addition of the disavow tool in Google, you can now take matters into your own hands and manage the backlinks coming into your site. It’s a great step in cleaning up the web and improving the relevance of the search results overall. You can find out more about the disavow tool here.
Google has held the spotlight in search ever since it revolutionized the way users access the web. It’s grown to the point where, in the last year, they consolidated all of their privacy clauses into one giant blanket policy that affects all of their online properties.
An example of moving forward with search, which I’ve mentioned a handful of times on the blog, is the DuckDuckGo search engine. Recently the small search company produced a video in which they talked about how Google has each user caught in what they called a search bubble. They took more than 100 users, ensured they were not signed into their Google accounts, had them conduct searches on specific terms, and captured their results.
What they found was that even when the users were not signed into their accounts, and even in the same geographical area, they received differing results pages. It’s not a revelatory video really, as Google isn’t the only company on the web that uses browser cookies to determine who a user is and what they may like. Not to discredit what DuckDuckGo is hinting at, but with such a small sample, and with users on their personal computers without clearing any session cookies, it’s no wonder the results were different for each user. Adding a control group, perhaps 20 or so users on completely clean installs of a browser and OS, would help balance their results.
The numbers for the past month in search came out, and while Google remains on top with the majority share, what was somewhat disheartening was the continuing slide of Yahoo’s position.
There’s been no real shift in the overall numbers: Google is still sitting at just over two thirds of the search share, and Bing follows with about a 15% share. Yahoo slipped even further than in previous months, to around 12%, giving the combined Bing-powered engines just shy of 30% of the market. Yahoo was one of the original search engines and one of the first to roll out a paid search marketing platform, so it’s sad to see them slip further out of the limelight of relevancy. Change, however, is inevitable, and can be a good thing for all parties involved.
The search share numbers aren’t terribly surprising in the grand scheme of things, and perhaps it was the additions of Panda and Penguin to the equation, but the number crunchers are at it again. On the webmaster forums there is discussion going on about what the current algorithm may contain and how it might use analytics to help rank the sites in the index.
Some interesting theories are coming out of the discussion, mainly because no one outside of Google really knows the process for ranking the sites within the index. Google has previously said that they don’t use any search data from their Chrome browser, and the running theory so far is that the search giant is using click data from ISPs. In the end, only the team at Google really knows how the engine ranks its results.
There’s been a small surge of malware reports coming from searches on Bing and Google, which really isn’t news in and of itself, as malware results are always buried somewhere within the results. What is different is that on some terms, more than 90% of image results were found to be malware related.
The most targeted term this time around happens to be “Emma Watson”, whom McAfee has named their most dangerous celebrity search of 2012. Of the two engines, 30% of Google’s searches had malware warnings attached, and more than double that came out of Bing’s results. Malware takeovers happen in a couple of different ways: among the most frequent are websites built with little to no security, and then there are throwaway websites and URLs used purely for the spread of malware on the web.
Black hat SEOs typically go after the hottest search terms and poke around the web looking for websites with loopholes in their security. They actively work to hijack the website and its URL to lend false authority to whatever term they’re wanting to spam. And because uneducated or hasty users tend to automatically trust the top results in the search engines, the spread of malware will continue.
Because of the recent discovery that image results are getting slammed with malicious results while the text results pages are being left behind, Bing has been unofficially dubbed (for now) the most poisonous search engine. The moniker stems solely from the recent report about malicious websites targeting image searches as opposed to the text results pages. Not to fret, however, as Bing and Google will take steps to close the holes which have been opened in the image results; in the meantime, just be a little more cautious before clicking that top image of your favorite star.
There’s merit to the web search teams and the tidbits of news they put out every day. Sometimes they talk about changes to the algorithm, and about when to expect a shift in the search rankings or placements.
It’s not uncommon for sweeping changes to be made to the web which leave old code irrelevant; if every little bit were always kept as part of the formula, the search algorithms wouldn’t be the few hundred factors they are now, they would be in the thousands. The search industry would be substantially slower moving, both in use and as a business. Building a website would be an absolute nightmare if you still had to worry about table construction for a half dozen different browsers, and whether the Flash menu you’ve built with fancy fly-away modules would even start up in others. Thankfully, none of that is an issue, as old coding techniques get replaced with much more up-to-date methods.
One of the basics of web development that we’ve always made sure to mention to web developers and clients is to utilize the meta keywords and description tags. For a number of years, the search engine optimization industry has said that these tags no longer bear any relevance or importance, and that the engines themselves will more often than not choose what they like to use anyway. Putting time and effort into writing even the few lines of code it takes to add them is, supposedly, a few lines too many. Fair enough, everyone is entitled to their own opinion, but I offer an example of what can happen when you let the engines decide what to publish as your website’s description in search:
SEARCH. <div> <div > </div> <div> <div templateType=”C1″> <ul role=”listbox” class=””> <li role=”menuitem”> headline </li>
Just a couple of points to note: this is from a multi-million dollar company with strong rankings in search and in traditional media, and this is the description that was pulled for use in the results page. This mix-up won’t hurt their positioning online or offline, but it proves that skipping the basics isn’t always a good idea. For a new site online, skipping the description tag on the premise that it’s an outdated step, and having that type of description pulled, would be the end.
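For contrast, the few lines of code in question really are just this; the company name and wording below are hypothetical filler, not any real site’s tags:

```html
<head>
  <title>Acme Widgets - Winnipeg Widget Supplier</title>
  <!-- The description the engines can show verbatim in the results page -->
  <meta name="description" content="Acme Widgets sells industrial widgets across Canada, with free shipping on orders over $50.">
  <!-- Largely ignored by the major engines now, but cheap to include -->
  <meta name="keywords" content="widgets, industrial widgets, winnipeg">
</head>
```

With a hand-written description in place, the engines at least have a sensible default to fall back on instead of scraping raw template markup.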
Canada is still really coming into its own online, and perhaps Google has come up with the right idea to encourage some growth. The Google eTown award is designed to recognize towns where small businesses are investing in online tools and resources to find new customers, grow their business, and improve their operations.
“We’re delighted to recognize the accomplishments of the small businesses embracing the web in each of these cities,” said Chris O’Neill, Managing Director of Google Canada. “We know that the Internet is going to contribute massively to Canada’s economic growth.”
We’ve been saying for a few years here at Fresh that online business is still coming into its own for all us Canucks; hopefully this new incentive will help spur things along. They’re not playing any favorites, and have divided the country up into 5 zones: Atlantic, Quebec, Ontario, Prairies, and BC & North. This year the 5 winners are: Moncton, New Brunswick; Dorval, Québec; Parry Sound, Ontario; Canmore, Alberta; and Duncan, British Columbia.
Across each of the winning towns, businesses have discovered that the Internet has become a vital tool for reaching new customers and engaging with existing ones. In recognition of their eTown status, Google Canada will be presenting the awards to each city during events throughout October. Local businesses will be invited to attend and meet Google experts who will be on hand to provide advice on growing their business online. Actively working on your brand image is one of the key initiatives you need to pursue while growing online. Being able to translate your offline business into your online image is one of the goals you should be aiming for, and when you’re ready for that step, Freshtraffic is here to help.