Browsing "internet news"
Lots of noise has been made in the social arena over the last week or so, with Google lauding its social network, Google+, by saying that over 90 million people have joined so far. That's still just a drop in the social bucket compared to Facebook's 800+ million, but some have started to question just how genuine Google's numbers are.
Reports have been growing online that the process to sign up for a Gmail account has changed from being a simple matter of choosing a login name and a password. It seems that now you're required to fill out a social profile for a newly created Google+ profile. To say it's irked people who were previously considering getting a Gmail account would be putting it mildly. A workaround was quickly found for those interested only in a Gmail account, but that loophole obviously won't last. And although I haven't noticed any changes in my own search results, there are reports of people saying that Google+ is taking over their SERPs. It's gotten bad enough that just today, Facebook and Twitter developers released a 'tool' to remove the social results from your pages, named, of all things, "don't be evil". The tool is a bookmarklet that you place in your shortcuts bar, which can 'fix' your results if you've found they're full of Google+ entries.
Throwing another curve into Google's already twisted recent social signals, in the earnings report that was just released, Larry Page was ecstatic about the reported 'gains' they were making in the social arena. It's not really a surprise that Page would be so excited about Google+; when he took the helm last year he was throwing out big bonuses for improvements on the Google social front. It's a rocky start for the online giant, but it's a start which will make 2012 an impressive year for social media.
There's been a shift in the algorithm lately, as most in search are aware, and it may have a little to do with a Google blog post recently put up. In it, Google commented on websites with heavy advertising at the top of the page, forcing visitors to hunt for the actual content, and on how things are going to change.
The post reads that there are complaints about basically having to search twice on the web: a user clicks on what they deem most relevant to their search, is greeted with a website heavy with advertisements at the top of the page, and then needs to search the website itself for their content. Top-heavy websites were described as so densely populated with ads that a user has to scroll to even begin to find content. Google also mentioned that, going forward, those websites which are advertisement heavy 'may not rank as highly as before'.
Now don't fret if you have a block of ads on your website; you're not going to be kicked into the basement of search. This change is going to affect less than 1% of searches conducted globally, so unless you're in the business of having tons of spam on your webpages, you should be just fine. As usual with any kind of algorithm change, no doubt in a few days the 'end of SEO' will be heralded online for the beginning of the year. But those who have been playing the game from the beginning, who helped shape just how search works and hammer out the rules, are the ones you should watch. Until it comes from one of the real search experts that SEO is dead, I'll just keep on plugging along.
In just a few more hours, the internet will officially be on strike in demonstration of the resistance to the SOPA bill currently being pushed in the States. A great number of sites are 'going dark' in support of the event, the idea being that it will give web users an indication of what the web may be like should this bill, or any like it, come to pass.
Some popular web destinations like Reddit (which bills itself as the front page of the internet), Facebook, Wikipedia, Tucows and even Google are gearing up to participate in tomorrow's blackout. There is some very strong language circulating where SOPA is concerned, going so far as to call the bill an attack on free speech on the web. Those who support the bill's passage deem the blackout a knee-jerk reaction meant to cast the bill in a bad light; the documents are out there for you to make your own decision.
Google has come along with a handy dandy guide as well, to assist with going dark in support of tomorrow's events. The advice comes in the form of returning a 503 header for your website, which is your way of telling web spiders to come back later because your site is currently unavailable. This will work should you decide to be part of tomorrow's protest, and it also comes in handy should you need to take your site down for emergency work.
It's fairly easy to return the 503 instruction for bots, especially if your host allows .htaccess overrides (and assuming RewriteEngine is already on). Adding the following lines to your .htaccess file can take care of it for you; the condition keeps the rule from looping on the error page itself:
RewriteCond %{REQUEST_URI} !^/path/to/file/myerror503page\.php$
RewriteRule .* /path/to/file/myerror503page.php [L]
just adjust the path in accordance with your web server's layout. What this will do is route visitors to your error page for you. The crucial bit is that the error page itself must send an actual 503 status code (and ideally a Retry-After header) rather than a plain 200, so any spiders poking around know the outage is temporary, and the search engine optimization efforts you've been working on for your site aren't thwarted.
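To make the mechanics concrete, here's a minimal sketch of what the error page needs to emit, written with Python's standard-library HTTP server purely for illustration (the handler and port are my own invention, not from Google's guide); the same two pieces, a 503 status and a Retry-After header, are what your PHP error page would send:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 + Retry-After, the same signal
    the .htaccess/PHP combination described above should produce."""

    def do_GET(self):
        self.send_response(503)                    # Service Unavailable
        self.send_header("Retry-After", "3600")    # ask crawlers to retry in an hour
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance - back shortly.</h1>")

    def log_message(self, format, *args):
        pass  # keep the demo quiet

def make_server(port=0):
    # port=0 lets the OS pick a free port; you'd use 80 in practice
    return HTTPServer(("", port), MaintenanceHandler)

# make_server(8000).serve_forever()  # uncomment to actually serve
```

Crawlers treat a 503 as "temporarily away, check back later," whereas letting bots see an empty page with a 200 status risks the blank content being indexed in place of your real pages.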
So Google made a little bit of a blunder with their Chrome advertising, it seems, and what was the end result? Perhaps the best way to understand what happened, and its ensuing result, is to understand the algorithm a little better.
The Google search algorithm was intentionally designed to go out and read as much of the content of the web as it could find. It pays no heed to race, color or quality of the content. It doesn't care how pretty your pictures are, how impressive your flash intro is or how quickly you can flip through the items on your navigation bar. It takes in the content of the web and spits it out when you ask it a question. It's because it's so simple that filters needed to be put in place, and penalties levied against people who managed, either by accident or on purpose, to get around those quality controls.
Paying for PageRank, that intangible mega star of the Google world, is a heavily punishable offence under the quality control guidelines. So it came as a rather big surprise when it was suddenly found that Google was seemingly paying for advertising which was passing PageRank to its Google Chrome web page. The skeptics of the web automatically assumed that the Google machine would just shrug, apologize to the web, as they didn't intend for it to happen, and everyone would be on their way. The outcome, however, was actually the opposite.
Matt Cutts, via his Google+ account, had the following to say of the incident:
“Google was trying to buy video ads about Chrome, and these sponsored posts were an inadvertent result of that. If you investigated the two dozen or so sponsored posts (as the webspam team immediately did), the posts typically showed a Google Chrome video but didn’t actually link to Google Chrome… we did find one sponsored post that linked to www.google.com/chrome in a way that flowed PageRank.. we only found a single sponsored post that actually linked to Google’s Chrome page and passed PageRank, that’s still a violation of our quality guidelines”
So okay, it was found there was a minor slip between what was intended and what actually resulted; so what did they do?
“In response, the webspam team has taken manual action to demote www.google.com/chrome for at least 60 days. After that, someone on the Chrome side can submit a reconsideration request documenting their clean-up just like any other company would. During the 60 days, the PageRank of www.google.com/chrome will also be lowered to reflect the fact that we also won’t trust outgoing links from that page.”
If anyone ever questioned whether the machine would point its gun at itself, question no longer. As the webspam team has shown, no one is above the rules set with quality searches in mind. So bear in mind when you next work on your website's SEO: ensure that you're following the best practices and the search guidelines readily found all over the web, or else you'll find yourself flung deeper into the rankings than you could imagine.
Did Bing play dirty over the shopping holidays? If you tried at all this most recent Cyber Monday to use the Bing search engine, the signs currently point to yes, they did play dirty with their results.
The creators of the idea of Cyber Monday found themselves lost in Bing's search listings because, according to Bing, their content was too "thin". If the term is familiar, it's because it sounds a lot like Google-speak from when they started rolling out the infamous Panda updates and culling "thin content" websites from their index. A difference to note, however: Panda didn't actually remove the offenders from the index, it just meant the odds of those sites ranking well plummeted.
Back now to Bing's version of taking care of thin content: removing websites which fall into this category. Cyber Monday is now a billion dollar online shopping event, where website owners have the opportunity to make some good money heading into the holiday shopping weeks. If a site which could promise and deliver strong referrals could rank well, it would also stand to make a fair bit of change. Shop.org coined the term Cyber Monday in '05 and a year later created the corresponding website, cybermonday.com. This past Cyber Monday, Google had the website in their SERPs, while Bing did not. Bing did, however, have their own shopping channel listed at the top of their results for searches on cyber monday.
Bing has stated previously that they will dispense internet justice on sites deemed unworthy to be listed as part of their SERPs, but completely removing any and all traces of a site? Bing defines spam as:
Some pages captured in our index turn out to be pages of little or no value to users and may also have characteristics that artificially manipulate the way search and advertising systems work in order to distort their relevance relative to pages that offer more relevant information. Some of these pages include only advertisements and/or links to other websites that contain mostly ads, and no or only superficial content relevant to the subject of the search. To improve the search experience for consumers and deliver more relevant content, we might remove such pages from the index altogether, or adjust our algorithms to prioritize more useful and relevant pages in result sets.
If Bing were sticking to their guidelines when they removed the cybermonday.com website, they should have removed all "thin" websites which fell under the same blanket. Yet they did not, and websites which feature almost identical content to the cybermonday.com website still appeared in their results. To further muddy the waters, the Bing-powered search results served up in Yahoo would turn up Black Friday "websites" which would be deemed even thinner than the Cyber Monday website. With all the fuss that Bing was putting up about Google favoring their own results over all others, this sure doesn't look good on the Bing radar. The Panda updates may drop a website's rank if it's found to be too thin, but at least they're not completely removing it from the index à la Bing.
Is Bing more biased than Google when it comes to the results pages? In research that has been gaining traction as of late, the answer seems to be yes. It wasn’t a directed study on a few select terms either, it was a large random sampling of the SERPs conducted by a professor at George Mason University.
What he found in the tests he conducted was that, for the most part, Bing favors Microsoft content more often and more prominently than Google favors its own content. According to the findings, Google references its own content in its first results position in just 6.7% of queries, while Bing provides search result links to Microsoft content more than twice as often (14.3%). The percentages may seem small, but when you consider there are billions of searches performed daily, suddenly 14% isn't such a small number.
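To put that in perspective, here's a quick back-of-the-envelope calculation; the daily search volume is an assumed round figure for illustration, not a number from the study:

```python
# Hypothetical round figure for total daily searches; the study itself
# doesn't supply a volume number.
daily_searches = 3_000_000_000

bing_self_referral_rate = 0.143    # 14.3%, from the study
google_self_referral_rate = 0.067  # 6.7%, from the study

# Even "small" percentages turn into enormous absolute counts at web scale.
bing_daily = daily_searches * bing_self_referral_rate
google_daily = daily_searches * google_self_referral_rate
print(f"Bing:   {bing_daily:,.0f} self-referring first results per day")
print(f"Google: {google_daily:,.0f} self-referring first results per day")
```

Under that assumption, the gap between the two engines works out to hundreds of millions of self-referring results every day.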
The findings also cast a different light on the recent FTC antitrust complaints which Google has been handling surrounding anti-competitive behaviour. It’s also a stark contrast to a similar study done earlier in the year, which concluded that “Google intentionally places its results first.” So now as a user with two completely different data points, which is the set to believe?
Well, the second study had two goals in mind: to replicate the findings of the first study, and to expand on the methods used to determine whether the results were perhaps an artifact of how they were gathered. From the very beginning, it was found that while Google does favor its own content at some points, the selection of terms where it does so is exceedingly small. It was also learned, though it wasn't mentioned in the first study, that Bing does precisely the same in preferring Microsoft results, but for a much wider range of terms than Google, and is much more likely to do so. "For example, in our replication of Edelman & Lockwood, Google refers to its own content in its first page of results when its rivals do not for only 7.9% of the queries, whereas Bing does so nearly twice as often (13.2%)"
As for the second part, the study used a much larger, more random sampling of search queries, as opposed to the mere 32 samples the first study used to portray Google as the big bad guy of search. The findings of the second study were related at the beginning of this post: Google references its own content in its first results position, when no other engine does, in just 6.7% of queries, while Bing does so over twice as often (14.3%).
So, what does this mean for the end user? Google (and Bing, though less so) really is trying to deliver the best results possible, regardless of whether they come from its own services (local search, product search, etc.) or not. It all comes down to preference.
So remember a little while back when Google decided to try the whole social thing and launched Buzz? And it cost them a few million because they "oops forgot privacy"? Well, the FTC has finally decided how to handle the giant, and it will throw a bone to the privacy-concerned members of the public to boot.
For the next 20 years, Google will be subject to privacy monitoring from the U.S. Federal Trade Commission. Thanks to what are being called "deceptive practices" in its now-dead Google Buzz social networking service, the search giant will have the FTC babysitting it. The investigation into Google's privacy practices began after users complained that their Gmail contacts were made public, and that the steps to protect their privacy weren't clear or effective.
When the service launched, Gmail users were given the option to participate in Buzz, but what Google failed to mention was that the people they email most often would be listed publicly. Those users that declined to participate were also automatically enrolled into at least some of the Buzz features without their consent, according to the FTC investigation.
“In response to the Buzz launch, Google received thousands of complaints from consumers who were concerned about public disclosure of their email contacts which included, in some cases, ex-spouses, patients, students, employers, or competitors,” the agency said. “The FTC charged that Google failed to disclose adequately that consumers’ frequent email contacts would become public by default.”
The FTC also added that Google had misled the public in regards to its privacy policies, and misrepresented its compliance with U.S. and E.U. Safe Harbor "or other privacy, security, or compliance programs." So for the next 20 years Google is going to have a monkey on its back, with the FTC able to watch their every social change, and if the search giant isn't careful, it may find itself back in hot water.
It’s been just around a month now that Google+ became open for business, and Google remains undaunted in its effort to go toe-to-toe with Facebook.
Vic Gundotra, vice president in charge of Google+, said, "We are in an enviable position that we have people who come to Google, we are in this for the long haul… By Christmas you will see Google+ strategy coming together."
Google+ has attracted more than 40 million users since it opened to the public, but has a long way to go to catch up with Facebook's membership of approximately 800 million.
Google is looking at tying all of their current apps and extensions into Google+ accounts, the goal being to sync the whole mess together with Docs, YouTube, etc. Eventually, Google aims to open the platform to outside developers to make games and other kinds of installable "apps" like those that have been part of Facebook's success.
Google is moving slowly and cautiously to make sure its social network is a safe, stable haven for families, friends, and other associates who connect with one another in “circles” created at the service.
Gundotra acknowledged that Facebook has the advantage of a "network effect," in that complex webs of friends are established there and people might find it daunting to up and relocate to Google+.
“The incumbent (Facebook) has a huge advantage, if you play the same game, you are not going to win… So we are going to do it differently.”
One of the larger contrasts between the two networks is that Google+ offers much more discretion over what you share, and with whom.
“We do not believe in over-sharing,” Gundotra said. “There is a reason why every thought in your head does not come out your mouth… We think a core attribute to being human is to curate.”
Google+ launched with a requirement that people use their real names online in order to let others find them more easily, but they are aiming to eventually allow people to use pseudonyms instead of their real names. The real-name policy has been a thorn in the fledgling social network's side since early in its beta incarnation.
“We wanted this to be a product where you can discover people you know,” Gundotra said. “You don’t know ‘Captain Crunch’ or ‘Dog Fart’.”
Based on the rest of the discussion from the conference, it’s looking like Google can’t wait for Christmas to get here.
Around 18 months ago Google announced that it had a new search interface for the privacy concerned. This encrypted search, which encrypts both queries and results, was launched with the wireless user in mind, I'd imagine, seeing as it allowed for a level of privacy normally only enjoyed over a wired internet connection.
Now fast forward to today: on October 18, Google announced that it would begin pushing users with a Google account to Google's encrypted search homepage. The move towards making search more private has some in the SEO sphere a bit troubled. Google is approaching this from an interesting angle, as it's recently been discussed that analytics is also going to be changing to a different model of delivering search metrics.
The flip side of offering more secure searches and results to WiFi users, and the portion which has some in the SEO community worked up: searches performed and returned in this manner won't report the keywords which were used to conduct the search. Google search product manager Evelyn Kao wrote in Google's official blog,
“When you search from https://www.google.com, websites you visit from our organic search listings will still know that you came from Google, but won’t receive information about each individual query.”
Paid search results will still pass on the same information as in a non-encrypted search. Otherwise, the only keyword information available will be from Webmaster Tools, and even then it will only provide the top terms for the last 30 days, with no details as to which pages were visited on the site. That's the scary side if you're an inexperienced SEO who may work on the darker side of the grey scale.
The other half of the story: Google hasn't released any information as to how many signed-in users perform searches. With the deeper introduction of real-time searches, friend shares and the like, I'd be inclined to believe it's going to take a fair while before there are any sizable changes in the SERPs.
The anti-trust hearings against Google and their supposed stranglehold on the web have been continuing in front of the Senate. There are people on all sides of the argument, it seems: Google on the defensive, Microsoft and a few others decrying that they've been wronged by the search giant. And one of the most basic arguments that Schmidt has used to rebut all the claims of unfair business could very well win the day. Schmidt's defence basically says:
“Google faces competition from numerous sources including other general search engines (such as Microsoft’s Bing, Yahoo!, and Blekko); specialized search sites, including travel sites (like Expedia and Travelocity), restaurant reviews (like Yelp), and shopping sites (like Amazon and eBay); social media sites (like Facebook); and mobile applications beyond count, just to name a few.”
Now on one hand, yes, Google can provide all of the services that are available on the web, but there are simply better options. If you're big into social networking, Facebook is still the king; if you travel a lot, you use Expedia to find tickets and deals. I've personally used Amazon, eBay and Kijiji to post and purchase items, and even the smaller search engines like Blekko have their place, with a few tricks that Google just can't do.
So Schmidt's argument that there are options available online, users just need to navigate to them, is utterly true. Google doesn't so much have dominance of the internet as it has a dominating presence in the search arena. And as many out there would point out, Bing, Yahoo and the little start-ups like Blekko which come along chip away little by little at that armour. Google's search advantage isn't going to disappear or diminish in any great capacity until a revolutionary game changer makes itself known, just as Larry and Sergey did with Google.
So don’t worry about Google’s “dominating web presence” so much, instead use your keyboard and mouse and investigate the alternatives. Just because one site offers similar products, doesn’t automatically mean you have to use them. After all, you wouldn’t call Coca-Cola to order some Pepsi.