There’s been a number of newsworthy topics today; one of the biggest in the search sphere would have to be the glitch with Google Webmaster Tools. It created a bit of drama, but thankfully it has been addressed, so it’s no longer an issue. Then there’s Larry Page sitting down with the FTC, extending the cycle of litigation claiming the search engine is biased in how it displays search results, favoring its own products over others. The other tidbit of news that caught my eye was around DuckDuckGo, its crawl frequency, and how it seems to run on its own set of rules.
The only real issue between Google and the FTC is that neither really wants to be negotiating at this point. With Page sitting down with the FTC over the antitrust talks, there doesn’t seem to be any common ground where the two are even attempting to meet. The FTC won’t give in unless Google grants it enforcement authority over the results it serves, and Google isn’t likely to give up that control anytime soon. The disappointing part is that neither party likely wants the case to go to litigation, as that will only increase the time it takes to make any kind of progress on the claims by fairsearch.org, which believes Google is guilty of search results bias and serves its own web properties over others. Soon enough, someone will have to buckle; it’s just a waiting game at this point to see who it is.
As for all of the drama surrounding Webmaster Tools accounts with Google? Well, someone must have plugged in an old verification server, because a glitch gave people access to accounts they should no longer have had. Thankfully the error has been repaired, but it does leave a bad feeling about the verification process and how it was bypassed by a simple glitch in the system. Hopefully it’s not an easily repeatable error, as having access to site information you’re not supposed to have could well be a chargeable offense.
The small story about DuckDuckGo has some interesting implications for the still somewhat small search engine. DDG has prided and formed itself on the idea that it does not collect user data, so you get “clean” results each time you search. The idea is that the most relevant pages should always be able to nab the top spots, regardless of your online activities. It was noticed, however, that DDG was crawling from its own IP but wasn’t displaying its own user agent, the string site owners use to determine who visited their site. The answer from the horse’s mouth was fairly basic, but depending on how the information it returns is interpreted into its index, it could have some interesting SEO implications:
“What you’re seeing is not a crawler, but a parked domain checker. We don’t believe it needs to be identified as it only makes one request very infrequently and doesn’t index any information”
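For site owners curious about what’s been visiting, the user agent lives in the server’s access logs. A minimal sketch of pulling it out, assuming a combined-format log line and DuckDuckBot’s published user-agent string:

```python
import re

# In the Combined Log Format, the user agent is the final quoted field.
LOG_PATTERN = re.compile(r'"([^"]*)"\s*$')

def extract_user_agent(log_line):
    """Return the user-agent string from a combined-format access log line."""
    match = LOG_PATTERN.search(log_line)
    return match.group(1) if match else None

line = ('203.0.113.5 - - [12/Nov/2012:10:00:00 -0600] '
        '"GET / HTTP/1.1" 200 1234 "-" '
        '"DuckDuckBot/1.0; (+http://duckduckgo.com/duckduckbot.html)"')
print(extract_user_agent(line))
```

A parked-domain checker that sends no user agent at all would simply show up here as a blank or generic string, which is exactly why site owners had trouble identifying it.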
There’s been a case of defamation in an Australian court, where it was claimed that Google (knowingly) defamed someone by tying him to organized crime in both organic and image search. Google was found guilty by a jury and has been ordered to pay a fine that amounts to about 30 seconds of work for them ($200,000), but it’s not the fine that has the company a bit worried, it’s the precedent it would set. Google is currently appealing the decision, so we’ll all have to wait to see what happens.
The case stemmed from the search results, both organic and image listings, which showed the claimant with ties to the local crime scene. Google responded that it isn’t in control of the results page; it merely lists what has been observed as popular search terms for the area. It sounds like a weak argument, but you can see how Google tracks top trends by looking at Google Trends, which gives you a very brief glimpse into what the top searches were for the last day or so.
Back to why this is a bad idea, however: holding Google accountable as a publisher, and not as an information provider. The jury in this particular case decided that Google was guilty as a publisher because it created the page which delivered the false information, and the image results pages served up when you search are Google-specific creations. As anyone with experience working with images online can tell you, the alt attribute can be used to give an image a text value, which can then be indexed by the search engines. The image results page has actually been the most frequent target of black hat manipulators over the last couple of months, not only because of this feature but because it helps them get listed much quicker than pushing for listings in the center of the page.
Google being declared a publisher of the search results pages makes it accountable for the comments that came up in search, even though it never actually created the content. It’s happened a handful of newsworthy times in recent years, with Rick Santorum being the most recent victim of results page manipulation by spammers and other unscrupulous methods, but the results pages were driven by the users and by the most frequently used search terms. Blaming any search engine, not just Google, for the aforementioned issue is like blaming your mechanic for your bus being late getting you to work. Once something is on the internet it’s also notoriously difficult to remove; ask any of the celebrities out there who have unflattering photos pop up from time to time. Once it’s online, it’s there forever. This also brings up the point of online brand protection and the importance of a positive relationship in the local scene; with proper brand management, mistakes like this can be caught and stopped before they begin.
The ruling sets a scary precedent; if it stands, it opens the door to an increasingly censored internet. Add into the mix that the ITU will be meeting in just a couple of weeks, and the issues of net neutrality and freedom of use and access start to come under threat.
Search is a finicky thing on its own, let alone when you start throwing in all sorts of (seemingly) random variables to serve the results pages. Both Bing and Google have their own sets of checks and balances they use to deliver the results page based on your search terms. As varied as the internet is, there will be metrics both algorithms share, and the differing ones are what make each engine’s results unique. These algorithms have developed and grown over time, as has the search market and the way it functions as a whole.
The search market started out in a very basic way: you typed in the terms you wanted information on, and the engines searched through the index their spiders had built, trying to return the results that best matched your request. Your query was matched against the exact terms and anything resembling them; search began as a relative function. As the definition goes, relative means in relation to, so if you searched the term ‘red rose’ as an example, you’d not only get images and descriptions of flowers, you’d also likely end up with pages about the baking flour as well. Both items are relative to the search term you’ve entered, so it would make sense for a bot to show you both, as it couldn’t discern what you were searching for.
Now the web has grown up a lot; it’s started to mature and has developed some, almost scary, tricks. The term that has been thrown around a lot in the last 6 months especially is semantic search. The simplest way to describe it is with that same term from relative search, ‘red rose’, but with the way the web and search are evolving, the bot would act intelligently. It’s being seen more and more often in Google, Bing and the social networks: because you’re an avid baker, the bots would likely serve you results pages more populated with the baking flour and associated recipe websites. Now it won’t bet the farm on you wanting the baking results, so you’d also receive some of the flower results on your page, but it’s a best-guess situation.
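To make the distinction concrete, here’s a toy sketch in Python. The page data and scoring are entirely made up, and no real engine works this simply; it only illustrates literal term matching versus a ranker that re-weights results using what it knows about the searcher:

```python
# Hypothetical mini-index: page name -> the terms it contains and its topic.
PAGES = {
    "rose-garden-guide": {"terms": {"red", "rose", "flower"}, "topic": "gardening"},
    "red-rose-flour-recipes": {"terms": {"red", "rose", "flour", "baking"}, "topic": "baking"},
}

def relative_search(query_terms):
    """Literal matching: return every page containing all query terms."""
    return [name for name, page in PAGES.items() if query_terms <= page["terms"]]

def semantic_search(query_terms, user_interests):
    """Re-rank the literal matches, boosting pages on topics the user cares about."""
    def score(page):
        base = len(query_terms & page["terms"])
        return base + (2 if page["topic"] in user_interests else 0)
    matches = relative_search(query_terms)
    return sorted(matches, key=lambda name: score(PAGES[name]), reverse=True)

# A literal search for 'red rose' returns both the flower and the flour.
print(relative_search({"red", "rose"}))
# For an avid baker, the flour page is ranked first, but the flower still appears.
print(semantic_search({"red", "rose"}, user_interests={"baking"}))
```

Note the semantic version doesn’t discard the flower page, it just demotes it, mirroring the “best guess” behaviour described above.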
Semantic search, and eventually predictive search, is the way we’re heading. Soon you won’t even really have to search for an item or a website; the bots, or whatever technology is running things at that time, will know what you’re looking for within the first few terms you type. It might seem scary, it may even seem intrusive given the way the world works and how people think. The simple truth, however, is that this is where the web and search are going. It also means that, from my point of view, the job of online branding will become vastly more important than it is today.
When we build a website for a client, whether they’re in Winnipeg or anywhere else in the world, we make sure that any kind of forward-thinking marketing is covered. And since we’re in the business of online branding and internet marketing, we try to make sure that each website we develop has the capability to become a leader in its niche, so long as the client decides they want to. We didn’t just settle on some arbitrary checklist; there are some very specific points we look for. We’ll go over a handful of them this time around, and if you’re involved in the industry in any way, you’ll probably recognize some traits in the platform you use.
One of the very first necessities is being able to customize the page titles and meta tags of each page. If it’s a properly built website, and you’re following the best practice guides that both Google and Bing have readily available, then you should already know that having an identical title or tags on all of your pages is a big no-no. You should at the very least be able to customize each page title, meta data, and your header tags; if you can’t manage these very basic snippets of information on your site, then you’ve already started off on the wrong foot, and we haven’t even gotten to the hard stuff yet!
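A quick way to audit this on your own site is to group pages by title and flag any title shared by more than one page. A minimal sketch, with hypothetical crawl data standing in for your real pages:

```python
from collections import defaultdict

# Hypothetical crawl data: page URL -> its <title> text.
page_titles = {
    "/": "Freshtraffic | Winnipeg SEO",
    "/services": "Freshtraffic | Winnipeg SEO",
    "/contact": "Contact Freshtraffic",
}

def find_duplicate_titles(titles):
    """Group URLs by title and return the titles shared by more than one page."""
    by_title = defaultdict(list)
    for url, title in titles.items():
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

print(find_duplicate_titles(page_titles))
# Here the homepage and /services share a title and both need a rewrite.
```

The same grouping works for meta descriptions; just swap in that field from your crawl.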
I’ve touched on this point several times, but when you’re building your site you need to think about the navigation menu. And I’m not referring to creating a singing and dancing menu that thanks a visitor for being a part of the website experience; I’m looking more at a navigation menu that uses CSS to control the display elements. You can have an impressively interactive navigation menu just by using CSS, which is easily indexed by all of the search engines and much more responsive than a JavaScript or Flash equivalent. Besides being responsive and a solid display method, it also lets you control the contents of the menu, so if you happen to make a spelling mistake, don’t be surprised to find it indexed if you’re not paying attention.
It’s an often overlooked feature, as a typical site owner doesn’t usually think about the website link beyond the main address, but being able to control how your URLs are created is a major point where best website development practices are concerned. If you’ve ever copied and pasted a link from a major online shopping site like eBay into an email, you’d notice the link contains a mess of letters and numbers (=item20cdb2380c&_uhb=1#ht_599wt_1139). These letters and numbers aren’t there for users, they’re definitely SEO-unfriendly, and they should be avoided wherever possible.
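If your CMS lets you control URL creation, the usual approach is to generate a readable “slug” from the page title instead of an opaque ID string. A simple sketch of one way to do it:

```python
import re

def slugify(title):
    """Turn a page title into a readable, SEO-friendly URL path segment."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse anything non-alphanumeric into hyphens
    return slug.strip("-")                   # drop leading/trailing hyphens

print(slugify("Red Rose Flour: 10 Best Recipes!"))
# -> red-rose-flour-10-best-recipes
```

A URL like /recipes/red-rose-flour-10-best-recipes tells both users and search engines what the page is about before they ever click it.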
These are only a couple of the very basic best practices that you’ll find discussed in any of the website development guides out there. If you’ve got the time, you should work your way through your site and, if you have one, your CMS backend, and ensure you have all of the above functionality. If you’ve learned that you don’t have these capabilities, get in touch with us here at Freshtraffic as soon as possible and we’ll get that taken care of for you. The longer you wait on necessary changes like the above, the deeper you could be lost in the results pages.
It’s no secret that Google is the big kahuna where search is concerned, and they make enough money year after year that they could have their own printing press. But for the last year or so especially, Google has been the target of antitrust and privacy complaints across the globe, with advocates pushing for more from the search giant: claims that it takes too long to clean up your past from the search engine, and blame aimed at the provider for results deemed inappropriate.
The web is, at its core, a giant repository of everything: pictures, videos, text, scripts, code, and trillions upon trillions of 1s and 0s that make up websites and documents. It’s often a strange sensation to go back to an old website you used to frequent, read some of your past ramblings and wonder, what was wrong with me, or, why would I write something like that? With the way the internet holds onto its history, you can often find information about anything, or anyone for that matter. You would be hard pressed to think up a legitimate search topic that wouldn’t appear on a search engine somewhere, and it’s highly likely that Google has it indexed and stored on one of its multitude of data centers across the globe.
It’s that level of access to information that seems to have the hackles of some of the population up and has them calling for regulations on search engines. Soon it won’t be just Google caught up in these privacy and antitrust regulation talks; Google is being made an example of because they’re the biggest target out there, and so, who better to hit. The plain and simple point of contention, access to information, isn’t a search problem; I’d blame it more on a generational divide. The youngest users of the web, the 13-18 year olds, have grown up with 24/7 access to the web and all of its content, while the top end of the user range, the 65+ age group, sees the internet in a completely different way.
40 years ago, when a family went on vacation and took snapshots, they didn’t share them with 400 of their friends on a social network. It was maybe the 6-10 close family friends they shared their details with, so they could control their information and had a semblance of privacy. Flash forward to now with the same family, and you have the little sister posting pictures to Instagram and Facebook while the 17 year old son is watching a streaming Netflix movie. Mom and dad are using a GPS navigation system with turn-by-turn functionality and setting up a video chat with the friends they’re on their way to visit. Every time that photo is viewed on Facebook or Instagram, it’s being saved with another web address, in another location. Every time you’ve used Skype or your iPhone to conduct a video call, the connection and duration have been saved on a data server, and every movie or show you stream online has helped define your likes and dislikes with the service, so you can have a better targeted product to view at a later date. It used to be called personal accountability: if you didn’t want to be viewed in a certain way, you just didn’t act that way. Conducting yourself well online has only become more important.
Privacy hasn’t disappeared, but it’s definitely not the same as it was 40 years ago; as a person living in the digital age, you need to be acutely aware of your online conduct, because everything you say, do, or post is saved somewhere. Google, Bing, Yahoo, and all of the other search engines just search for information. They do not operate with bias or under the control of some megalomaniac with a god complex who is out to control the world. All they do is take a mess of 1s and 0s and display them in a way that a person can understand. And just remember that the information people are trying so hard to push Google to bury, erase and hide can be found just as quickly on the other major search engines out there.
Online marketing and branding can be an intensely competitive market, made even more difficult by the billions and billions of web pages out there about everything you can imagine. And while they say imitation is the sincerest form of flattery, it can tend to be a death note where the search engines are concerned.
With the web being so massive, it can often be difficult to say where content originated. Images get copied, text gets scraped, and snippets of code get replicated across untold numbers of websites. Where organic optimization is concerned, it’s a time-intensive process to prove original authorship in some cases, and even then it may not make a ton of difference. There is a difference, however, where paid advertisements are concerned, such as with Adwords campaigns.
Adwords is a much different platform from organic search, the biggest difference being that you’re paying for your positioning in the results pages. You bid on your chosen keywords, and if your ad copy and your bid are better than your competitors’, then your ad will appear, frequently before theirs. It’s a lucrative search market, namely because it’s where people make their snap buying decisions. Sometimes there are companies out there which play a little dirtier than others, copying ad copy directly, or even copying ad titles and format. It’s a dirty business practice; compare it to Pepsi mimicking a Coca-Cola commercial or jingle.
As dirty as it is to copy a competitor’s titles, copy or entire text, due to the nature of the business they may be allowed to run the ad, unless of course you dispute their usage. A prevalent argument often found in these cases falls under the Adwords informational site policy, a long-winded document that exists to cover the use of trademarked terms in Adwords. It basically limits the use of a trademarked term to the original mark holder or a reseller of the product. The loophole appears, however, in the portion covering informational sites, which can carry the trademarked text if the landing page of the ad is informative in nature relative to the written ad text. Just because the loophole exists doesn’t mean you’re out of luck if your competitor runs an identical ad using your text; your first step should be to file a dispute in your Adwords account against the ad. You’re also covered by the same trademark policy text, which basically says you can’t use a trademarked term if the goal is to take sales away from the trademark holder.
Make sure to be diligent with your Adwords copy, and if you see someone using your very own text to try to snag away sales, report them as soon as possible. If you let it slide, there’s nothing stopping them from taking your next big sale.
Recently Google turned on a tool which enables website owners to disavow selected backlinks coming to their site. It’s a great tool that allows a diligent site owner to take control over who links to them. The process is fairly basic: you create a text file listing the backlinks you’d like disavowed, upload it to your Webmaster Tools account, and voila, supposedly case closed.
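For reference, the file itself is plain text, one entry per line: a full URL disavows a single link, a `domain:` entry disavows every link from that domain, and lines starting with `#` are comments. A hypothetical example (the domains here are placeholders, not real sites):

```text
# Links I asked spammy-directory.example to remove on 10/1/2012; no reply.
http://spammy-directory.example/links/page1.html
http://spammy-directory.example/links/page2.html

# Disavow every link coming from this domain.
domain:shadyseoforum.example
```

Google suggests using the comment lines to document your removal attempts, since the file may be read alongside a reconsideration request.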
It seems, however, that some people aren’t content with the way the system works. After submitting his disavow list and resubmitting a reconsideration request, one webmaster was greeted with the advisory that there were still some bad links pointing to his site. The timeline he’s unhappy with: it has been a month since his initial submission of the disavowed links. There are a couple of theories about why there are still problems, but there are also a handful of points that all webmasters who use the disavow links tool need to bear in mind.
A primary point to remember when using the disavow links tool is that it is not an instant or quick fix for any and all backlinks you might want removed. Google has data centers all over the globe, and with that it has a number of different versions of your website at any given time. As odd as it may sound, it’s like using a versioning system when working on a project through various stages of completion, so when you’re finished you can see what your steps were all the way through. Just as you could look at version 2 of your project and have an idea of where you were, each data center will have a slightly different version of your site and its backlinks. It takes time for any kind of clean-up request to propagate through the entire system.
A second major point that needs minding: just because you’ve submitted the disavow list and/or asked for the offending backlinks to be removed, it doesn’t mean it’ll happen quickly (as per the first point), or at all. The tool functions much like asking another site owner to remove a link to your site; it’s a request, and you have no control over how quickly, or whether, it happens.
So the election is finally finished, and the winner has been decided. If for some reason you’ve been living in a cave the last couple of days, Obama took the crown and is set to begin his second term as President of the United States. And regardless of who you were rooting for, there were some interesting discoveries over the last couple of months of the battle which have their roots in search.
A few days back, the Wall Street Journal ran a story about how Google was serving up results pages in what some thought was a strange coincidence. It seemed that even when signed out of a Google account and using a cookie-free browser, the results when searching for Obama almost became personalized. The article even went on to say that the search engine was biased when searching for Obama and related news, with one story coming right out and saying that the candidates were being treated unfairly. While it would make for a great conspiracy story, the unexciting truth is that it’s just how the Google algorithm works. Google simply displayed results based on how people searched for terms, the example being:
more people searched for “Obama” followed by searches for “Iran” than the number of people who searched for “Romney” followed by “Iran.”
That was the first interesting point, the second follows in a similar vein.
It’s not really news anymore that the candidates spent hundreds of millions of dollars on campaigning, but it was interesting to find that Obama outbid Romney on search ads online at nearly three to one. Both were bidding on big hitters like “2012 election” and “2012 presidential polls” to lead people to their campaign websites, but it was the incumbent President who owned the paid advertisements on the results pages. Sticking with the trend of online visibility, Obama had Romney beat across the board with more Facebook fans, website visitors and YouTube video views.
The largest demographic in the voting populace is shifting to a much younger, information-hungry crowd, so being findable online should be an integral cog in any party’s agenda. When you shake all the numbers out, from organic results to paid search, it looks like in the end Obama simply out-optimized his opponent, and it helped secure him a second term.
There have been a number of changes in the search world over the past 15 years since its pseudo-birth, but the changes in the last 12 months have been some of the largest ever. There have been the Panda updates, the Penguin changes, and the EMD (exact match domain) changes that have made search engine optimization a much more interesting job. And not that they’re the only search engine in the game, but leave it to Google to make the most news with any change, seeing as they own the vast majority of the market.
I’ve outlined what can happen when you make a mistake and breach one of the rules set forth by the engines. You can take a rankings hit, you could suffer a penalty in the form of losing some (Google) PageRank, or you could even be completely removed from the index if you’ve accumulated enough ‘strikes’ against your website or URL. As search engine optimization experts, it’s our job to ensure that neither we nor our clients fall into any of the multitude of pitfalls out there. None of these scenarios are unrecoverable, although getting back into the good graces of the search engines will take some time and an extensive SEO skillset.
However, if you don’t have the time, or any search engine optimization skills under your belt, or maybe you don’t have the budget to bring in the real search experts, there is a solution for your business. It’s one which will still take time, but you don’t have to worry so much about the SEO skills initially, because you’re going to start down the road of rebranding. Completely rebuilding your brand image is really a last-resort option, as it can take almost a year to return to the search results pages. If you’ve found yourself far enough up the creek that rebranding is a more viable option than repairing the mistakes you’ve made, perhaps it’s time for an evaluation of your job description.
And finally, after months of patience, site owners with a Google Webmaster Tools account have the final say over how links to their site are treated. From the Google Webmaster blog:
Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manual spam action based on “unnatural links” pointing to your site, this tool can help you address the issue.
This is going to be a great tool to add to your toolkit if you use Webmaster Tools directly, and if you don’t, you should check that your site manager knows what the tool can actually do for you. A quick rundown of how links to your site affect you: you create content, and if it’s unique content relevant to your niche, users will link to that page. These links are used as a factor when determining relevance in the results pages for the terms you wish to rank for, and if you’ve created great content, the links will follow. More links are taken as a measure of relevance, so the more the better. But there’s a downside to links, and that’s when you have too many ‘unnatural’ links pointing to your site; for example, links from a plumbing site pointing back to your website on shoe sales, two topics irrelevant to each other. The recourse you had as a site owner in this instance was to contact the website that posted the link and ask to have it removed. It was then out of your hands and left for them to deal with, and until it was dealt with, you could be handed a stiff penalty by the search engines.
The problem with that scenario is that after you’ve asked the site owner to remove your link, you no longer control what happens next. But with the addition of the disavow tool in Google, you can now take matters into your own hands and manage the backlinks coming into your site. It’s a great step toward cleaning up the web and improving the relevance of search results overall. You can find out more about the disavow tool here.