Last week a piece ran in the New York Times suggesting, heavily, that Google and its algorithm need to be taken in hand and monitored. Citing Google’s financial incentives, its handling of more than 60% of web queries worldwide, and its ability to break small business owners with a shift in rankings, the article pressed for having the government decide what Google can and can’t change within the algorithm: make the algo public, let the government decide which tweaks can and can’t be made, and let it determine, in the end, what’s relevant for users.
Needless to say, it wasn’t taken lightly. Danny Sullivan wrote an entertaining response, using the verbiage from the article nearly word for word but swapping the New York Times in for Google. It’s an entertaining read, and I suggest taking the time. One of the more enjoyable points for me: at the end he compares the Times to Google point by point, including a side-by-side look at each business’s transparency.
Google will list EVERY site that applies for “coverage” unlike the New York Times, which regularly ignores potential stories
If Google blocks a site for violating its guidelines, it alerts many of them. The New York Times alerts no one
Google provides an entire Google Webmaster Central area with tools and tips to encourage people to show up better in Google; the New York Times offers nothing even remotely similar
Google constantly speaks at search marketing and other events to answer questions about how they list sites and how to improve coverage; I’m pretty sure the New York Times devotes far less effort in this area
Google is constantly giving interviews about its algorithm, along with providing regular videos about its process or blogging about important changes, such as when site speed was introduced as a factor earlier this year.
June 2007, Google allowed New York Times reporter Saul Hansell into one of its search quality meetings, where some of the core foundations of the algorithms are discussed.
Whose article rings of more truth to you?
Well, under a new patent approved this past week, Google will have an idea of just what you do like to point at. The patent, titled “System and method for modulating search relevancy using pointer activity monitoring,” was filed in 2005 and granted this week. It describes a system for monitoring the movements of a user-controlled mouse pointer in a web browser, identifying when the pointer moves into a predefined region and when it moves back out of that region.
So basically you can think of it like the hotspot area of an image map, or a div in CSS. Google can assign an area of their SERPs for analysis and track where a searcher’s mouse moves. What type of information it yields, and how it could be applied in the realm of SEO, is still to be determined. It could, however, give Google a better understanding of how well a SERPs page of blended results (both paid and organic) fares.
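The patent itself doesn’t publish code, but the core mechanism is easy to picture. Here’s a minimal, purely illustrative sketch (the `Region` and `track_pointer` names are mine, not Google’s) of detecting when a pointer enters and leaves a predefined region:

```python
class Region:
    """An axis-aligned rectangle, e.g. the bounding box of one SERP result."""

    def __init__(self, left, top, right, bottom):
        self.left, self.top, self.right, self.bottom = left, top, right, bottom

    def contains(self, x, y):
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def track_pointer(region, positions):
    """Turn a stream of (x, y) pointer samples into enter/exit events.

    Returns a list of ("enter" | "exit", sample_index) tuples, which is the
    kind of signal the patent describes feeding back into relevancy scoring.
    """
    events = []
    inside = False
    for i, (x, y) in enumerate(positions):
        now_inside = region.contains(x, y)
        if now_inside and not inside:
            events.append(("enter", i))
        elif inside and not now_inside:
            events.append(("exit", i))
        inside = now_inside
    return events
```

Dwell time between an enter and its matching exit would be one obvious metric to derive from such a stream.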
And, in the realm of satire, these headers are from a live website whose owner noticed Google had flagged it as a possible spam site. Internet cookies for those who can see what’s wrong. Who said Google doesn’t read meta tags?
<meta name="author" content="" />
<meta name="alexa" content="100"></meta>
<meta name="googlebot" content="noodp" />
<meta name="pagerank™" content="10"></meta>
<meta name="revisit" content="2 days"></meta>
<meta name="revisit-after" content="2 days"></meta>
<meta name="robots" content="all, index, follow"></meta>
<meta name="distribution" content="global" />
<meta name="rating" content="general" />
<meta name="resourse-type" content="documents" />
<meta name="serps" content="1, 2, 3, 10, 11, ATF"></meta>
<meta name="relevance" content="high" />
Google recently announced its “build your own app” program for the everyday person who’d like to customize their Android-powered phone. For free. Apps developed with the platform can be listed in the Android store for a nominal registration fee. Some have said this will lead to an influx of poorly designed apps, and others have argued that it opens people up to a whole new realm of spam.
Just to add to the mix, Microsoft has decided to toss its hat into the ring as well. Ahead of the expected arrival of the Windows Phone 7 platform, it has launched its own suite of developer tools.
A brief timeline from the Windows Phone Developer Blog:
Feb 2010 – Windows Phone 7 was unveiled at Mobile World Congress in Barcelona
Mar 2010 – The application platform was unveiled at MIX 10 in Las Vegas. With that, we had the first CTP of the Windows Phone Developer Tools.
Apr 2010 – The tools received an update, and the CTP Refresh shipped.
Jun 2010 – Windows Phone Marketplace details unveiled at TechEd 2010.
July 2010 – Beta release of Windows Phone Developer Tools, and the preview developer phones start shipping to ISVs
The iPhone has its apps, with quality guidelines, a store, and whatnot, plus an SDK that isn’t terribly difficult to learn but is made for the technically inclined. Compare that to the newest Android developer software, which gives virtually anyone the ability to create custom apps for their Android-powered phone. And now there’s the Microsoft version, allowing further customization of Windows Phone 7 handsets. To add a little cream to the offering, free classes on how to fully utilize the Microsoft software are available. The premise:
It will provide developers a jump start for developing Windows Phone 7 applications.
The dates for these course sessions are:
July 20 – 8am: Session One: Getting Started with Microsoft Windows Phone and Silverlight
July 20 – 1pm: Session Two: Programming Game Applications with XNA
July 22 – 8am: Session Three: Programming Applications with Silverlight
July 22 – 1pm: Session Four: Review and Wrap Up
This is a big milestone for everyone involved in Windows Phone 7 – inside and outside of Microsoft – and we hope you share in our excitement. With the Beta release of the tools, developers can build apps with a “ship it” mentality.
So now it’s turned into much more than just a handset battle; the software, and the apps powered by that software, have entered the fray. With the power to completely customize your cell phone’s functions and uses to cater to your needs, the days of paid app development may be riding toward the horizon. As an additional bonus, the marketing potential for a creative, enterprising small business owner is tremendous.
Being the big dog on the playground, it’s inevitable that you’ll step on some toes. Most recently, Google has stepped on the European Union’s.
The European Union’s antitrust chief said Wednesday he is looking “very carefully” at allegations that Google Inc. unfairly demotes rivals’ sites in search results.
Using language like “the importance of search to a competitive online marketplace,” competition commissioner Joaquín Almunia accepted Google’s argument that, given its size and far-reaching strength online, it’s difficult at times to behave in a market as dynamic as the internet. With a storefront active 24/7/365, once a company has worked its way to the top of the game, the little guys can sometimes get knocked about unknowingly.
The inquiry was launched, however, due to Google’s recent acquisition of ITA, an online travel software company. Two EU-based comparison sites complained to the union that they were ranked lower in the SERPs because they are competitors. And seeing as higher rankings lead to higher search volumes, the EU may have a case. The algorithms are all programmed, with no human interaction within the SERPs, so in the end the bottom may fall out of the case.
There are many different search engine optimization ideas, and many influences that affect visibility on the SERPs. One such idea embraces the notion that search engines like fresh content. Trends shift and evolve, at times revolving around a guy who told a guy, who overheard it at a search expo and posted it to his Twitter account. And in the end, the principle behind the idea can get lost.
Adding fresh content can mean different things to different people and firms. The watered-down version you’ll tend to hear from SEO “experts” is that search engines like websites that change their content often. The logic is that changing on-page content will attract bots more often and somehow improve visibility.
To expound a little on the point: changing your pages and updating your news and story feeds for visitors is a proactive measure for retaining that visitor base. Managing, maintaining, and optimizing the links to that information for the bots is a bonus as well.
Changing the text and content of your website on a regular basis for its own sake, however, is a good example of how a best-practice measure of SEO loses its meaning.
Real fresh content comes from adding new pages to your website on a regular basis: adding new pages, optimizing the linking structure to them, and providing well-written, compelling content for visitors and bots alike. This is the definition of fresh content for your and your clients’ websites. The higher the quality of the content, and the more proficient its optimization, the more bots and visitors alike it will draw.
Of the billions of web pages to be found online, and the millions upon millions of pages on search engine optimization alone, always bear in mind the trickle-down effect.
There’s a kaleidoscope of steps, styles, methods and opinions about the right way to implement search engine optimization (SEO) for your site. But, there are a few points which are generally accepted. Points such as:
- Quality content is extremely important
- Working actively to accrue quality links and backlinks is also paramount
- Apply K.I.S.S. to your site
One of the most overlooked steps, which should be mentioned more often, is having an accurate, up-to-date sitemap for your website. You can think of a sitemap as the formal written index of your web pages. Until recently, multiple sitemaps were needed if you wanted all of your content listed easily: images, text, videos, your geo location, a news section. An individual sitemap for each was required to speed up the indexing of those assets. Google introduced the XML sitemap five years ago, and has just recently changed the game a little.
Instead of multiple sitemaps, webmasters can now submit one XML sitemap covering all of a website’s content types. From Google:
With the increasing number of specialized formats, we’d like to make it easier for you by supporting Sitemaps that can include multiple content types in the same file.
<?xml version="1.0" encoding="UTF-8"?>
<video:title>Grilling tofu for summer</video:title>
The idea behind including multiple content types within one sitemap is to streamline the entire process for webmasters and their clients.
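As a rough sketch of what such a combined sitemap might look like (the URLs and file names here are placeholders; the namespaces follow Google’s published video and image sitemap schemas):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- one entry carries the page, its video, and its image together -->
    <loc>http://www.example.com/grilling-tofu.html</loc>
    <video:video>
      <video:title>Grilling tofu for summer</video:title>
      <video:description>How to grill tofu without it sticking.</video:description>
      <video:content_loc>http://www.example.com/videos/tofu.flv</video:content_loc>
    </video:video>
    <image:image>
      <image:loc>http://www.example.com/images/tofu.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

One file, one submission, and every content type rides along.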
Earlier in the week, Facebook’s own version of SEO (social engine optimization, to turn a phrase) lit up the news world as its bid to tackle Google. Seeing, however, that the idea is powered largely by users liking a page, it doesn’t seem to hold any cards against the search giant. That doesn’t mean the idea should be ignored; social optimization is just as important to your business, provided you have the Facebook and Twitter accounts.
The average internet user is already notorious for fast browsing and merely scanning content by nature. Add into the mix, the chaos of social media, and the attention span for the content in front of them drops again.
Creating compelling, relevant, and provoking content is a major key to gaining a high volume of links, votes, and traffic. Style and structure, however, are also major factors in social media success. We’ll go over just a few basic points of social optimization to help your pages receive the “Like” that you desire.
Try using shorter sentences – Writing your most relevant, compelling, attractive information in short, informational phrases can be the turning point in keeping a user from clicking that back button. Keeping your key phrases and terms in shorter, easier-to-digest sentences and paragraphs allows the searcher to quickly determine that you meet their requirements on the social web.
Table of Contents – If you’ve shortened your information as much as possible and still have miles upon miles of text, construct a table of contents with anchors within the page. This allows quick navigation to interesting sections and provides that extra usability that can be very helpful.
Bullet points and Lists – Breaking your more complex portions into bullet points or lists allows for quick and simple reading. Breaking your page down in such a fashion also lends itself to easy linking within the page and site.
Photos and images – Using amazing imagery within your pages helps draw visitors to your page, while your written content is designed to keep them there. “A picture is worth a thousand words,” after all. Just be sure your images are relevant to the content.
Social media is here to stay, and it’s best to get used to the idea. Your pages and content need to be attractive, intelligent, and compelling at first impression. Taking the time to be sure your social optimization is up to par is well worth the investment. Building a loyal visitor and fan base in the social media sector of the web will ensure long-term viability in the marketplace.
“The third time’s the charm,” the idiom dictating that the third attempt at something is likely to yield the desired results, apparently didn’t sit well with Google.
The algorithm change that ran April 28th – May 3rd, nicknamed Mayday, showed a shift in long-tail search results. It’s been hashed and rehashed all over the web, but simply put: it was done on purpose, it was done for quality purposes, and it’s completely algorithmic, with no human interaction at all.
Since the Mayday change, there have been three more seemingly drastic shifts in the SERPs, with some sites seeing as little as a 10% shift and others as much as an 80% drop. There are reports of spam sites taking front-page placements, and of poorly written, poorly constructed, ad-filled pages replacing formerly authoritative, professional sites. The shifts being discussed have all centered on long-tail returns, with shorter queries only slightly adjusted.
After all the ideas have been discussed and the tinfoil-hat theories dissected, one common, agreed-upon conclusion emerges: the first week of July will be a doozy.
Google, the king of the web, the go-to guys in the realm of search, and the players holding all the cards, put themselves to their own test. Just some of the self-imposed questions for Google:
How many of Google’s web pages use a descriptive title tag? Do we use description meta tags? Heading tags? While we always try to focus on the user, could our products use an SEO tune up?
So how did they do? The report was published on their own webmaster blog, but we’ll touch on just a couple of the more interesting points here. Google always works for the user, to improve the user’s experience; they don’t work toward their own ends of how to rank or be found online. Googling is a verb now, so it’s not hard to find them. Some of the fixes they found were needed included 404s, broken links, confusing URLs in places, and better titles and description tags for their pages.
As described in their own SEO report card:
Google’s SEO Report Card aims to identify potential areas for improvement in Google’s product pages. If implemented, these improvements could:
• help users find our pages more easily in search engines
• fix bugs that annoy visitors and hurt our pages’ performance in search engines
• serve as a good model for outside webmasters and companies
They took 100 pages across different Google products and graded them against common SEO strategies. They found interesting numbers, such as only 33% of their products having descriptive meta tags. Only a third of their pages had proper snippet text, a terribly low number for the company that relies on that tag to pass a page’s summary on to the user.
They found that only 10% of their pages had proper titles, in length and format. They have some confusing URLs that could be redirected for ease of use, and nearly half of their images’ alt text needed improvement. 301s, 404s, and proper tags missing, oh my!
The entire report is an insightful read, and it’s plain to see that even when you’re the king of search, you can still make mistakes from time to time.
You can read the report for yourself here; I recommend the read.
Recently Google produced a short video wherein they dug into a few sites and checked out their optimization. People submitted sites for review, and Matt Cutts showed just a few of the standout points in the SEO game.
In essence, they really just covered the best practices rules of SEO, but as always it was great to hear them.
Be sure to use text on your page - Googlebot can’t see images the way a user can, so you need to have text on your page. Text also helps with your content and with selecting the keywords to attract searchers. Keeping your content original, with user-generated content as optional filler, is also good.
Focus is better - If you have multiple sites, it’s better in the long run to focus on your main site rather than the others, so as not to dilute your time and/or skills. Focusing your time and energy on building, SEO’ing, and maintaining a single site will pay off better than trying to spread yourself out.
Titles, Metas, and Layout - All are very important; the title tag came up multiple times within the presentation. A lot of companies construct or implement poor titles on their sites and on pages within their sites. A good practice is for your titles to include the keyword(s) for that page. The meta keywords tag, while not indexed, is not ignored, and the description tag is important, so be sure not to leave it blank. If your page has no meta description tag, Google will do its best to find a description from your page text to use on the SERPs, but no one knows your content like you do, right? And layout: if you’re going to build a site and promote it to your patrons, be mindful of its layout. If you place the quality content too far down the page, you run the risk of people hitting that back button before they discover just how valuable your website truly is.
If you use a Content Management System - Keep it up to date!! They spent a good deal of time on this topic as well, as CMSs are prone to attack; the more popular a CMS is, the higher the inherent risk. By keeping your CMS up to date, you lessen the risk of successful hacks on your website.
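To tie the title and meta advice above together, a head section along these lines covers the basics (the shop name and copy here are invented purely for illustration):

```html
<head>
  <!-- descriptive, keyword-bearing title, unique to this page -->
  <title>Hand-Thrown Stoneware Mugs | Example Pottery Co.</title>
  <!-- unique description: a candidate for the snippet shown on the SERPs -->
  <meta name="description" content="Hand-thrown stoneware mugs, fired and glazed in small batches. Free shipping on orders over $50." />
  <meta name="keywords" content="stoneware mugs, handmade pottery" />
</head>
```

Every page gets its own title and description; duplicating them across a site is one of the quickest ways to end up with the weak snippets Google called out in its own report card.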