They boil it down and compare the current partnership to the 2008 deal, which the Department of Justice blocked citing monopoly concerns. One of the bigger differences in the current case is that Google apparently already has the blessing of Japan's Fair Trade Commission, and has had it for some time. The next few weeks in Japan's paid and unpaid search industry will be a doozy.
Google has already made a bit of a misstep in the public arena with its bumbled release of Buzz, which exposed everyone's email contact lists publicly. But in keeping with the trends, Google has been pouring cash into Zynga, the creators of FarmVille and the like, to the tune of roughly $100 million.
If the hype is to be believed, Big G is working on another step into the social arena, perhaps this time on the gaming front. FarmVille, Mafia Wars and the like have huge followings within Facebook; FarmVille alone has upwards of 60 million players, and the number is climbing. When asked directly whether Google is entering the same social space as Facebook, however, Google CEO Eric Schmidt said:
“..the world doesn’t need a copy of the same thing.”
There’s a lot of speculation about Google versus Facebook, and while they’re both the major players in their respective arenas, both have fallen short when trying to enter the other’s space. Facebook’s internal search is clumsy and works only with its own pages, and Google’s social platform was dogged with security and privacy concerns. With the money being pumped into social gaming, and Google already owning Orkut, speculation would lead to the idea that Big G does have something up its sleeve. Just what that might be, however, remains to be seen.
It was only a matter of time, really. Previously the DoJ in the US was looking at the data Google had collected during its Street View runs, and was holding its cards close to its chest. Some of the individual states, however, have taken their own road, led by Connecticut AG Richard Blumenthal.
Blumenthal says 38 states and the District of Columbia will be participating in the investigation, with Florida, Illinois, Kentucky, Massachusetts, Missouri, and Texas on the executive committee. Other states joining the coalition include New York, Mississippi, Vermont, Nebraska, North Carolina, Oregon, Washington, Kansas, Montana and Rhode Island.
The whole mess kicked off when German privacy regulators launched a probe into Google’s Street View collection practices, discovering that the software was not only picking up open and unsecured wi-fi points, but was also collecting any data passing along the connection. Blumenthal’s main point of contention is that the answers Google gives only serve to raise more questions than they answer. When the wi-fi software was found in the Street View program, for example, it supposedly wasn’t known that there was tangible, usable data contained within it. Oops?
It’s being asked whether the specific persons involved with implementing the code snippet will be identified, and how it is that Google wasn’t aware what the code was fully capable of. It seems rather far-fetched that it would have gone completely unnoticed.
On a lighter note..
Almost a Facebook Nation..
As the overseer of the “third largest country” in the world, it’s no surprise that Mark Zuckerberg, Facebook CEO, denies signing over an 84% stake in the company for a mere $1000.
After a brief commemoration of the site passing the 500 million user mark, Zuckerberg admitted that the privacy policies on the site were handled poorly.
“We’ve made mistakes for sure, I think they’re a lot better.”
When pressed as to why personal information isn’t automatically set to fully private, the answer was basic: Facebook is set up in a way that enables people to share. He added, however, that ideally having certain information always private would be a step in the right direction.
Another D-Day is looming on the horizon, and website owners are going to be learning another step to the SEO dance. The Yahoo-Bing search results merger has been in the works for a while now, and in a recent press release from Bing, the proverbial trigger was pulled.
For webmasters, it’s important to be familiar with how the Bing crawler interacts with your site. After the full algorithmic transition is complete, you only need to optimize for one crawler (Bing), as we will provide Yahoo! with results from our index.
All of the little tricks, optimizations and tweaks we’ve learned over the last year can be trimmed down to the Bing bones, as it were. In other words, don’t be surprised if your site shuffles and changes in ranking on Bing and Yahoo, depending on which of the two engines you had been optimizing for.
You can find the entire press release, issued by the senior VP of their online services division, here.
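In practical terms, consolidation can start in robots.txt: if you previously gave Yahoo's Slurp crawler its own rules, a reasonable transitional step is to keep the Bing and Slurp sections identical until Slurp is retired. A minimal sketch (the path and delay values are placeholders, not recommendations):

```
# bingbot will crawl for both Bing and, post-merger, Yahoo results
User-agent: bingbot
Disallow: /private/
Crawl-delay: 5

# Yahoo's legacy crawler, kept in sync during the transition
User-agent: Slurp
Disallow: /private/
Crawl-delay: 5
```

Note that Crawl-delay is a nonstandard directive that Bing and Yahoo have historically honored; adjust or drop it to suit your server.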
Yahoo is up, Google is down, Bing is in the mix, and on average Facebook isn’t trusted. At least, that’s if you believe the numbers from the American Customer Satisfaction Index (ACSI), which tracks general consumer satisfaction levels with websites. This was the first time social media was included in the survey.
What was found, on average, was that social media platforms returned an average score of 70 out of 100, a fair step below portals, search engines, and news and information sites.
The survey looked at Wikipedia, YouTube, Facebook, MySpace and “all others.” Twitter wasn’t included, apparently because so much of Twitter’s access comes from third-party clients. As mentioned, the category average was 70. Facebook scored a 64, while YouTube scored a 73. The generic “all others” mysteriously received a 72.
In the laundry list of complaints about Facebook, privacy and security were prominent concerns. Also in the mix, but not limited to, were advertising, the constant and unpredictable interface changes, spam, annoying applications with constant notifications, and functionality. Age was a variable in the equation: older people rated Facebook lower, while the younger, more prevalent population of the website voiced less concern. As of late, however, the fastest growing segment on Facebook is the older generation, so according to the numbers, Facebook may want to take a look at how the ship is being steered.
The ACSI numbers aren’t concrete in the sense that they can make or break businesses; they have, however, proven to be a metric worth considering. The report in its entirety is an all-encompassing baseline which can help identify improvements to be made for your consumers.
Last week a piece was written in the New York Times which suggested heavily that Google and its algorithm need to be taken in hand and monitored. Citing financial incentives, Google’s handling of 60%+ of web queries worldwide, and the way a shift in ranking can break small business owners, it pressed for having the government decide what Google can and can’t change within the algorithm: make the algo public, let the government decide what tweaks can or can’t be made, and determine, in the end, what’s relevant for users.
Needless to say, it wasn’t taken too lightly. Danny Sullivan wrote an entertaining response, using the verbiage from the article nearly word for word, swapping in the New York Times for Google. It’s an entertaining article to read, and I suggest taking the time. One of the more enjoyable points for me: in the end he compares the New York Times to Google, and at one point even compares the two businesses’ transparency.
Google will list EVERY site that applies for “coverage” unlike the New York Times, which regularly ignores potential stories
If Google blocks a site for violating its guidelines, it alerts many of them. The New York Times alerts no one
Google provides an entire Google Webmaster Central area with tools and tips to encourage people to show up better in Google; the New York Times offers nothing even remotely similar
Google constantly speaks at search marketing and other events to answer questions about how they list sites and how to improve coverage; I’m pretty sure the New York Times devotes far less effort in this area
Google is constantly giving interviews about its algorithm, along with providing regular videos about its process or blogging about important changes, such as when site speed was introduced as a factor earlier this year.
In June 2007, Google allowed New York Times reporter Saul Hansell into one of its search quality meetings, where some of the core foundations of the algorithms were discussed.
Whose article rings truer to you?
Well, under a new patent approved this past week, Google will have an idea of just what you do like to point at. The patent, titled “System and method for modulating search relevancy using pointer activity monitoring,” was filed in 2005 and granted this week. It describes a system for monitoring the movements of a user-controlled mouse pointer in a web browser, identifying when the pointer moves into a predefined region and when it moves out of that region.
So basically, you can think of it like a hotspot link area on an image map, or a div region in CSS. Google can assign an area for analysis on its SERPs and track where searchers’ mice move. What type of information this yields, and how it could be applied in the realm of SEO, is still to be determined. It could, however, give Google a better understanding of how well a SERP composed of blended results (both paid and organic) fares.
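To make the idea concrete, here's a minimal sketch of what the patent describes: rectangular regions defined over a results page, with the tracker noting when the pointer enters and leaves each one and accumulating dwell time. All names, coordinates and the structure here are illustrative assumptions, not Google's actual implementation.

```python
# Hypothetical illustration of pointer-region monitoring. Region names
# and coordinates are made up for the example.

class RegionTracker:
    def __init__(self, regions):
        # regions: {name: (x1, y1, x2, y2)} bounding boxes in page coordinates
        self.regions = regions
        self.active = {}                               # region -> entry time
        self.dwell = {name: 0.0 for name in regions}   # region -> seconds spent

    def _hit(self, x, y):
        """Return the set of regions containing point (x, y)."""
        return {name for name, (x1, y1, x2, y2) in self.regions.items()
                if x1 <= x <= x2 and y1 <= y <= y2}

    def move(self, x, y, t):
        """Record a pointer position at time t (seconds since page load)."""
        now_in = self._hit(x, y)
        # Pointer left a region: accumulate the dwell time.
        for name in list(self.active):
            if name not in now_in:
                self.dwell[name] += t - self.active.pop(name)
        # Pointer entered a region: start its clock.
        for name in now_in:
            self.active.setdefault(name, t)

tracker = RegionTracker({"result_1": (0, 0, 500, 50),
                         "ad_block": (0, 60, 500, 120)})
tracker.move(10, 20, 0.0)   # enters result_1
tracker.move(10, 80, 1.5)   # leaves result_1, enters ad_block
tracker.move(600, 80, 2.0)  # leaves ad_block
print(tracker.dwell)        # result_1: 1.5s, ad_block: 0.5s
```

In a browser this would be driven by mousemove events rather than explicit calls, but the bookkeeping, and the kind of signal (time hovering over an ad versus an organic result) it could feed back into ranking, is the same.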
And, in the realm of satire, these headers are from a live website whose owners noticed Google had flagged it as a possible spam site. Internet cookies for those who can see what’s wrong. Who said Google doesn’t read meta tags?
<meta name="author" content="" />
<meta name="alexa" content="100"></meta>
<meta name="googlebot" content="noodp" />
<meta name="pagerank™" content="10"></meta>
<meta name="revisit" content="2 days"></meta>
<meta name="revisit-after" content="2 days"></meta>
<meta name="robots" content="all, index, follow"></meta>
<meta name="distribution" content="global" />
<meta name="rating" content="general" />
<meta name="resourse-type" content="documents" />
<meta name="serps" content="1, 2, 3, 10, 11, ATF"></meta>
<meta name="relevance" content="high" />
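The punchline, of course, is that several of these tags are pure fiction: no search engine reads a "pagerank™", "serps" or "relevance" meta tag, and you can't assign yourself a PageRank of 10. A quick sketch that flags meta names outside a known set (the "known" list below is an illustrative assumption, not an official registry):

```python
import re

# Assumed set of meta names search engines have historically paid attention
# to; illustrative only, not an authoritative list.
KNOWN_META_NAMES = {"author", "description", "keywords", "robots",
                    "googlebot", "rating", "distribution", "revisit-after"}

def suspicious_meta_names(html):
    """Return meta name attributes not in the known set, in document order."""
    names = re.findall(r'<meta\s+name="([^"]+)"', html, re.IGNORECASE)
    return [n for n in names if n.lower() not in KNOWN_META_NAMES]

html = ('<meta name="pagerank™" content="10"></meta>'
        '<meta name="serps" content="1, 2, 3"></meta>'
        '<meta name="robots" content="all, index, follow" />')
print(suspicious_meta_names(html))  # ['pagerank™', 'serps']
```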
Shares of Apple stock fell sharply on Tuesday afternoon — down to $246.43 before a modest recovery — and as of Wednesday morning were down about 8 percent overall from where they were on the iPhone 4’s June 24 release date.
In just a short window of three weeks, Apple has seen a fairly significant drop in the company’s worth. There are always the fans, of course, who will undoubtedly ensure the company’s success, but it’s a good reminder for all businesses to listen to their customers. The technical issues and limitations of the iPhone were known about prior to launch, and acceptable solutions weren’t attained. So instead of having a technological breakthrough (again), Apple is instead dealing with a problematic piece of hardware. The most recent buzz surrounding the iPhone and its issues is the smattering of talk here and there of a mass recall to address consumer concerns, which would only further the impact. Analysts within the industry have suggested the technical issues surrounding the phone aren’t necessarily the problem; it’s more the way Apple is dealing with them. Apple maintains, however, that despite a programming bug, the iPhone 4 is a fine product with solid reception.
In the midst of all the iPhone buzz, Apple is moving forward with the purchase of a Canadian company, Poly9, which creates browser-based 3D software. Seeing as Apple has shunned Flash from its phones, the acquisition of Poly9 may be another way forward for the company.
Google recently announced its “build your own app” program for the everyday person who’d like to customize their Android-powered phone. For free. Apps developed with the platform can be listed in the Android store for a nominal registration fee. Some have said this will lead to an influx of poorly designed apps, and others have argued that it opens people up to a new realm of spam.
Just to add to the mix, Microsoft has decided to toss its hat into the ring as well. Ahead of the expected arrival of the Windows Phone 7 platform, Microsoft has launched its own suite of developer tools.
A brief timeline from the Windows Phone Developer Blog:

Feb 2010 – Windows Phone 7 was unveiled at Mobile World Congress in Barcelona
Mar 2010 – The application platform was unveiled at MIX 10 in Las Vegas. With that, we had the first CTP of the Windows Phone Developer Tools.
Apr 2010 – The tools received an update, and the CTP Refresh shipped.
Jun 2010 – Windows Phone Marketplace details unveiled at TechEd 2010.
July 2010 – Beta release of Windows Phone Developer Tools, and the preview developer phones start shipping to ISVs
The iPhone has its apps, with quality guidelines, a store and whatnot, built on an SDK that isn’t terribly difficult to learn but is made for the technically inclined. Against that stands the newest Android developer software, which allows virtually anyone to create their own custom apps for their Android-powered phone. And now there’s the Microsoft version, allowing further customization of Windows Phone 7-powered handsets. To add a little cream to the offering, free classes on how to fully utilize the Microsoft software are available. The premise:
It will provide developers a jump start for developing Windows Phone 7 applications. The dates for these course sessions are:

July 20 – 8am: Session One: Getting Started with Microsoft Windows Phone and Silverlight
July 20 – 1pm: Session Two: Programming Game Applications with XNA
July 22 – 8am: Session Three: Programming Applications with Silverlight
July 22 – 1pm: Session Four: Review and Wrap Up

This is a big milestone for everyone involved in Windows Phone 7 – inside and outside of Microsoft – and we hope you share in our excitement. With the Beta release of the tools, developers can build apps with a “ship it” mentality.
So it has now turned into much more than just a handset battle; the software, and the apps powered by that software, have entered the fray. With the power to completely customize your cell phone’s functions and uses to cater to your needs, the era of everyday paid app development may be on the horizon. As an additional bonus, the marketing potential for a creative, enterprising small business owner is tremendous.
Being the big dog on the playground, it’s inevitable that you’ll step on some toes. Most recently, it appears Google has stepped on the European Union’s.
The European Union’s antitrust chief said Wednesday he is looking “very carefully” at allegations that Google Inc. unfairly demotes rivals’ sites in search results.
Using language such as “importance of search to a competitive online marketplace,” Almunia accepted Google’s argument that, given its size and far-reaching strength online, it’s difficult at times to behave perfectly in a market as dynamic as the internet. With a storefront active 24/7/365, when a company has worked to place itself at the top of the game, sometimes the little guys can get knocked about unknowingly.
The inquiry was launched, however, due to Google’s recent acquisition of the travel network ITA, an online booking agency. Two EU-based comparison sites complained to the union that they were ranked lower in the SERPs because they are competitors. And seeing as higher rankings lead to higher search volumes, the EU may have a case. The algorithms are all programmed, with no human intervention in the SERPs, so in the end, the bottom may fall out of the case.