There’s a new Google test that has caught fire in web discussions over the weekend. Google has been running a test algorithm segment that displays your search results as you type, dynamically updating the page as you add, change, or remove parts of your query.
The assumption is that the test has been rolled out only to those with very high-speed connections, as the nature of the results being delivered is unknown. They may come from a cached, prefetching server based on your previous searches, but they may also be entirely dynamic, automatically fetching results as you add a term.
Couple this recent test with the article reporting that Google is set to allow domain dominance on a search page, and the landscape of the SEO game will change somewhat.
Today we’ve launched a change to our ranking algorithm that will make it much easier for users to find a large number of results from a single site. For queries that indicate a strong user interest in a particular domain, like [exhibitions at amnh], we’ll now show more results from the relevant site.

Prior to today’s change, only two results from www.amnh.org would have appeared for this query. Now, we determine that the user is likely interested in the Museum of Natural History’s website, so seven results from the amnh.org domain appear. Since the user is looking for exhibitions at the museum, it’s far more likely that they’ll find what they’re looking for, faster. The last few results for this query are from other sites, preserving some diversity in the results.
This change could make a real difference in small-business SEO, and will definitely encourage niche marketing campaigns. So it’s time to put on your creative thinking caps, hash out the creative copywriting for your clients, and be ready to push for the niche search terms.
From a PR perspective, though, it’s an interesting twist on Norvig’s earlier comment about wanting more diversity in search results, in which he suggested making the second result as “different” from the first as possible to encourage diversity.
It’s not an unusual method of finding a service or product: you ask your friends and family for their opinions. It helps you form a preliminary opinion of your own, and with a few questions in mind you go for it. The trend, however, is shifting with the realization that word of mouth has been changing into world of mouth.
A new Cone Inc. report indicates that consumers don’t take word of mouth as gospel when making their decisions. Eighty-one percent of respondents agreed with the statement,
“After getting a recommendation about a product or service I may want to purchase, I go online to do additional research about that product or service before deciding whether to purchase it.”
One of the surprising findings was the disproof of the notion that bad news travels faster than good news. Online, the sway of good reviews and news about a product or service proved more potent than that of bad news and reviews. Only 68 percent of respondents admitted to changing their minds based on bad reviews, whereas 80 percent agreed that positive reviews found online solidified their decisions about a recommended product or service.
With these numbers in mind, it’s worth noting that while search engine optimization and search engine marketing are incredibly important to your business, it’s just as important to focus some of your attention on the social interaction of your client base. Whether it’s a question-and-answer form, a Twitter feed where client concerns can be addressed, or a Facebook page and wall, it’s worth the time to put in direct client interaction. Face time with your customers is still paramount in the digital world, and as the numbers from the Cone Inc. report show, having a positive image online will help you immensely.
Another D-Day is looming on the horizon, and website owners are going to be learning another step in the SEO dance. The Yahoo-Bing search results merger has been in the works for a while now, and in a recent press release from Bing, the proverbial trigger was pulled.
For webmasters, it’s important to be familiar with how the Bing crawler interacts with your site. After the full algorithmic transition is complete, you only need to optimize for one crawler (Bing), as we will provide Yahoo! with results from our index.
All of the little tricks, optimizations, and tweaks that we’ve learned over the last year can be trimmed down to the Bing bones, as it were. In other words, don’t be surprised if your site shuffles and changes in ranking on Bing and Yahoo, depending on which secondary search you work with.
You can find the entire press release, issued by the senior VP of their online services division, here.
Well, under a new patent approved this past week, Google will have an idea of just what you do like to point at. The patent, titled “System and method for modulating search relevancy using pointer activity monitoring,” was filed in 2005 and granted this week. It describes a system for monitoring the movements of a user-controlled mouse pointer in a web browser, identifying when the pointer moves into a predefined region and when it moves out of said region.
So basically, you can think of it as using a hotspot link area on an image map, or a div tag in CSS. Google can assign an area for analysis on their SERPs page and track where searchers’ mouse pointers move. What type of information it yields, and how it could be applied in the realm of SEO, is still to be determined. It could, however, give Google a better understanding of how well a SERPs page comprised of blended results (both paid and organic) fares.
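As a rough illustration of the enter/exit tracking the patent describes, here is a minimal sketch in plain JavaScript. Everything here is an assumption for illustration: the region names, the rectangle format, and the `report` callback are invented, and this is in no way Google’s actual implementation.

```javascript
// Sketch of pointer-activity monitoring over predefined regions.
// Regions are axis-aligned rectangles; the report callback receives
// an event each time the pointer enters or leaves a region.
function createRegionTracker(regions, report) {
  const inside = new Set(); // names of regions the pointer is currently in

  return function onPointerMove(x, y) {
    for (const r of regions) {
      const hit = x >= r.x && x < r.x + r.w && y >= r.y && y < r.y + r.h;
      if (hit && !inside.has(r.name)) {
        inside.add(r.name);
        report({ region: r.name, event: "enter" });
      } else if (!hit && inside.has(r.name)) {
        inside.delete(r.name);
        report({ region: r.name, event: "exit" });
      }
    }
  };
}
```

In a browser this would be wired to mouse movement, e.g. `document.addEventListener("mousemove", e => onPointerMove(e.pageX, e.pageY))`, with one region per search result on the page.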
And, in the realm of satire, these headers are from a live website whose owner noticed Google had flagged it as a possible spam site. Internet cookies for those who can see what’s wrong. Who said Google doesn’t read meta tags?
<meta name="author" content="" />
<meta name="alexa" content="100"></meta>
<meta name="googlebot" content="noodp" />
<meta name="pagerank™" content="10"></meta>
<meta name="revisit" content="2 days"></meta>
<meta name="revisit-after" content="2 days"></meta>
<meta name="robots" content="all, index, follow"></meta>
<meta name="distribution" content="global" />
<meta name="rating" content="general" />
<meta name="resourse-type" content="documents" />
<meta name="serps" content="1, 2, 3, 10, 11, ATF"></meta>
<meta name="relevance" content="high" />
There are many different search engine optimization ideas, and many influences that affect visibility on the SERPs. One such idea embraces the notion that search engines like fresh content. Trends shift and evolve, at times revolving around a guy who told a guy, who overheard it at a search expo and posted it to his Twitter account. And in the end, the principle behind the idea can get lost.
Adding fresh content can mean different things to different people and firms; the watered-down version you’ll tend to hear from SEO “experts” is that search engines like websites that change their content often. The logic behind the idea is that changing on-page content will attract bots more often and somehow improve visibility.
To expound a little on the point of fresh content: changing your pages and updating your news and story feeds for visitors is a proactive measure for retaining that visitor base. Managing, maintaining, and optimizing the links to that information for the bots is a bonus point as well.
Changing the text and content of your website on a regular basis, however, is a good example of how a best-practice SEO measure can lose its meaning.
Real fresh content comes from adding new pages to your website on a regular basis: adding new pages, optimizing the linking structure to them, and providing well-written, compelling content for visitors and bots alike. This is the definition of fresh content for your and your clients’ websites. The higher the quality of the content, and the better its optimization, the more it will draw bots and visitors alike.
Of the billions of webpages to be found online, and the millions upon millions of pages on search engine optimization alone, always bear in mind the trickle-down effect.
Earlier in the week, Facebook’s own version of SEO (social engine optimization, to turn a phrase) lit up the news world as their way of tackling Google. Seeing, however, that the idea is powered somewhat by users liking a page, it doesn’t seem to hold any cards against the search giant. That doesn’t mean, however, that the idea should be ignored; social optimization is just as important to your business, provided you have the Facebook and Twitter accounts.
The average internet user is already notorious for fast browsing and merely scanning content by nature. Add into the mix the chaos of social media, and the attention span for the content in front of them drops again.
Creating compelling, relevant, and provocative content is a major key to success in gaining a high number of links, votes, and traffic to your content. Style and structure, however, are also major factors in being successful in social media. We’ll go over just a few basic points of social optimization to help your pages receive the “Like” that you desire.
Try using shorter sentences - Writing your most relevant, compelling, attractive information in short, informational phrases can be the turning point in keeping a user from clicking that back button. Keeping your key phrases and terms in shorter, easier-to-digest sentences and paragraphs allows the searcher to quickly determine that you meet their requirements on the social web.
Table of Contents - If you’ve shortened your information as much as possible and still have miles upon miles of text, construct a table of contents with anchors within. This allows for quick navigation to interesting sections and provides that extra usability that can be very helpful.
Bullet points and Lists - Breaking your more complex portions into bullet points or lists allows for quick and simple reading. Breaking your page down in such a fashion also lends itself to easy linking within the page and site.
Photos and images - Using amazing imagery within your pages helps draw visitors to your page, while your written content is designed to keep them there. “A picture is worth a thousand words,” after all. Just be sure your images are relevant to the content.
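The table-of-contents and list suggestions above can be sketched in plain HTML with in-page anchors. The section names here are invented purely for illustration:

```html
<!-- Hypothetical table of contents linking to anchored sections below -->
<ul id="toc">
  <li><a href="#pricing">Pricing</a></li>
  <li><a href="#features">Features</a></li>
</ul>

<h2 id="pricing">Pricing</h2>
<p>A short, scannable paragraph about pricing.</p>

<h2 id="features">Features</h2>
<ul>
  <li>One benefit per bullet point, easy to read at a glance.</li>
  <li>Anchored sections are also easy to link to from elsewhere on the site.</li>
</ul>
```

Because each section carries its own `id`, other pages (and social media posts) can link straight to it, e.g. `yoursite.com/page#pricing`.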
Social media is here to stay, and it’s best to get used to the idea. Your pages and content need to be attractive, intelligent, and compelling on first impression. Taking the time to be sure that your social optimization is up to par is well worth the investment. Building a loyal visitor and fan base in the social media sector of the web will ensure long-term viability in the marketplace.
The idiom “the third time’s the charm,” which basically dictates that the third attempt at something is likely to yield the desired results, apparently didn’t sit well with Google.
The algorithm change that happened April 28th through May 3rd, nicknamed Mayday, showed a shift in long-tail search results. It’s been hashed and rehashed all over the web, but simply put: it was done on purpose, it was done for quality purposes, and it’s completely algorithmic; no human intervention at all.
Since the Mayday change, there have been three more seemingly drastic shifts in the SERPs, with some seeing changes as small as a 10% shift and others as large as an 80% drop in results. There are reports of spam sites taking front-page placements, and of poorly written, poorly constructed, ad-filled pages replacing formerly authoritative, professional sites. The shifts being discussed have all been around long-tail returns, with the shorter queries only slightly adjusted.
After all the ideas have been discussed and the tin-foil-hat theories dissected, one common, agreed-upon conclusion emerges: the first week of July will be a doozy.
Google, the king of the web, the go-to guys in the realm of search, and the players holding all the cards, put themselves to their own test. Just some of the self-imposed questions for Google:
How many of Google’s web pages use a descriptive title tag? Do we use description meta tags? Heading tags? While we always try to focus on the user, could our products use an SEO tune up?
So how did they do? The report was published on their own webmaster blog, but we’ll touch on just a couple of the more interesting points. Google always works for the user, to improve the user’s experience; they don’t work toward their own ends of ranking or being found online. Googling is a verb now, so it’s not hard to find them. Some of the fixes they found were needed included 404s, broken links, URLs that were confusing in some places, and better titles and description tags for their pages.
As described in their own SEO report card:
Google’s SEO Report Card aims to identify potential areas for improvement in Google’s product pages. If implemented, these improvements could:
• help users find our pages more easily in search engines
• fix bugs that annoy visitors and hurt our pages’ performance in search engines
• serve as a good model for outside webmasters and companies
They took 100 pages of different Google products and graded them against common SEO strategies. They found interesting numbers, such as: only 33% of their products had descriptive meta tags. That’s just a third of their pages with proper snippet text, a terribly low number for a company that relies on that tag to pass a page’s summary on to the user.
They found that only 10% of their pages had proper titles in length and format. They have some confusing URLs that could be redirected for ease of use, and nearly half of their images’ alt text needed improvement. 301s, 404s, and proper tags missing, oh my!
The entire report is an insightful read, and it’s plain to see that even when you’re the king of search, you can still make mistakes from time to time.
You can read the report for yourself here; I recommend the read.
Recently Google produced a short video wherein they dug into a few sites, checking out their optimization. People submitted sites for review, and Matt Cutts showed just a few of the things that are standout points in the SEO game.
In essence, they really just covered the best-practice rules of SEO, but as always it was great to hear them.
Be sure to use text on your page - Googlebot can’t see images the way a user can, so you need to have text on your page. Text also helps with your content and with selecting keywords to attract searchers. Keeping your content original, with the option of user-generated content as filler, is also good.
Focus is better - If you have multiple sites, it’s better in the long run to focus on your main site as opposed to any others, so as not to dilute your time and/or skills. Focusing your time and energy into building, SEO’ing, and maintaining a single site will pay off better than trying to spread yourself out.
Titles, Metas, and Layout - All are very important; the title tag was a topic point multiple times within the presentation. It seems a lot of companies construct or implement poor titles on their sites and the pages within them. A good practice is for your titles to include the keyword(s) for that page. The meta keywords tag, while not indexed, is not ignored, and the description tag is important, so be sure not to leave it blank. If your page has no meta description tag, Google will do its best to find a description from your page text to use on the SERPs, but no one knows your content like you do, right? As for layout: if you’re going to build a site and promote it to your patrons, be mindful of its layout. If you place the quality content too far down the page, you run the risk of people using that back button before they discover just how valuable your website truly is.
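To make the titles-and-metas advice concrete, here is a hedged example head section for a hypothetical florist’s product page; the business name, title, and tag contents are invented for illustration:

```html
<head>
  <!-- Title includes the page's target keyword, kept short -->
  <title>Wedding Bouquets | Example Florist</title>
  <!-- Description tag written by hand, so Google doesn't have to guess a snippet -->
  <meta name="description" content="Hand-arranged wedding bouquets delivered across the city. Browse seasonal designs, colours, and prices." />
  <!-- Keywords tag: not indexed, but not ignored either -->
  <meta name="keywords" content="wedding bouquets, bridal flowers, florist" />
</head>
```

Note how the description reads like the snippet you would want shown on the SERPs, rather than a list of repeated keywords.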
If you use a Content Management System - Keep it up to date!! They spent a good deal of time on this topic as well, as CMSes are prone to attack; the more popular one is, the higher the inherent risk. By keeping your CMS up to date, you lessen the risk of successful hacks on your website.
Companies in the UK have ramped up their investment in paid search and SEO this year as the economy has emerged tentatively from recession, according to research published today.
The proportion of companies saying they plan to increase spending on search engine optimization (SEO, or natural search) over the next year has increased to 60%, from 55% last year.
Pay-per-click search advertising is also buoyant, with 52% of companies planning to raise their budgets for paid search over the next 12 months compared to only 45% who said they would do so in 2009.
While the majority of companies are increasing their search budgets, only 14% said they are planning to decrease their paid search spending and only 4% plan to spend less on SEO.