New Canadian start-ups have enough to worry about without having to deal with the complexities of establishing an online marketing strategy. This past week, Google Inc. offered them some help. I’m not going to say I told you so, Yellow Pages, but is this the start?
In its newest service for Canadian business owners, the search giant unveiled a Canada-focused version of Google for Advertisers, which has been available in the United States since April 2009.
The new site, much like its American equivalent, uses data tailored to advertising in the Canadian market. It is designed to make Google’s various advertising features more easily accessible and understandable.
“It covers all of the product offerings that we have in Canada and how they all work together to create a full marketing opportunity,” said Andrew Swartz, spokesperson for Google Canada.
“For small and medium-sized businesses who may not know how to get started using AdWords or our other products, this is a resource for them to see what they’re able to do using Google to help develop their online marketing strategy.”
Some of the more useful features of the new site include insightful statistics showing what Canadian consumers are searching for, what sites they’re visiting, and the specific search terms they use to find particular businesses.
Google Analytics and the Google Website Optimizer allow businesses to track their return on investment by analyzing the relative success of certain marketing initiatives and website designs.
Here at Fresh Traffic Group we are no strangers to Google and how the Big G works; after all, they did buy our other company. When you need expert advice or guidance on Google products, SEO, or online marketing, feel free to give us a call.
(If you thought YP had a partnership with Google in Canada to sell AdWords: wrong! They have a strategic agreement like the rest of us. Oh no, how are the sales staff going to sell online now?)
It was only a matter of time really. Previously the DoJ in the US was looking at the data Google had collected during its Street View runs, and was holding its cards close to its chest. Some of the individual states, however, have taken their own road, led by Connecticut AG Richard Blumenthal.
Blumenthal says 38 states and the District of Columbia will be participating in the investigation, with Florida, Illinois, Kentucky, Massachusetts, Missouri, and Texas on the executive committee. Other states joining the coalition include New York, Mississippi, Vermont, Nebraska, North Carolina, Oregon, Washington, Kansas, Montana and Rhode Island.
The whole mess kicked off when German privacy regulators launched a probe into Google’s Street View collection practices, discovering that the software was not only picking up open and unsecured wi-fi points, but was also collecting any data passing along those connections. Blumenthal’s main point of contention is that the answers Google gives only serve to raise more questions than they resolve. When the wi-fi software was found in the Street View program, for example, it supposedly wasn’t known that there was tangible, usable data contained within it. Oops?
It’s being asked whether or not the specific persons involved in implementing the code snippet will be identified, and how it is that Google wasn’t aware of what the code was fully capable of. It seems rather far-fetched that it would have gone completely unnoticed.
On a lighter note..
Almost a Facebook Nation..
As the overseer of the “third largest country” in the world, it’s no surprise that Mark Zuckerberg, Facebook CEO, denies signing over an 84% stake in the company for a mere $1000.
After a brief commemoration of the site passing the 500 million user mark, Zuckerberg admitted that the privacy policies on the site were handled poorly.
“We’ve made mistakes for sure, I think they’re a lot better.”
When pressed as to why personal information isn’t automatically set to fully private, the answer was basic: Facebook is set up in a way that enables people to share. He added, however, that ideally having certain information always private would be a step in the right direction.
Yahoo is up, Google is down, Bing is in the mix, and on average Facebook isn’t trusted. At least, that’s if you believe the numbers from the American Customer Satisfaction Index (ACSI), which tracks general consumer satisfaction levels with websites. This was the first time social media was included in the survey.
What was found, on average, was that social media platforms returned an average score of 70 out of 100, a fair step below portals, search engines, and news and information sites.
The survey looked at Wikipedia, YouTube, Facebook, MySpace and “all others.” Twitter wasn’t included, apparently because so much of Twitter’s access comes from third-party clients. As mentioned, the category average was 70. Facebook scored a 64, while YouTube scored a 73. The generic “all others” mysteriously received a 72.
In the laundry list of complaints about Facebook, privacy and security were prominent concerns. Also in the mix, though not limited to these, were advertising, the constant and unpredictable interface changes, spam, annoying applications with constant notifications, and functionality. Age was a variable in the equation: older people rated Facebook lower, while the younger, more prevalent population of the website listed fewer concerns. As of late, however, the fastest-growing segment on Facebook is the older generation, so according to the numbers, Facebook may want to take a look at how the ship is being steered.
The ACSI numbers aren’t concrete in the sense that they can make or break businesses, but they have proven to be a metric worth considering. The report in its entirety is an all-encompassing baseline which can help identify improvements you can make for your consumers.
Last week a piece in the New York Times suggested heavily that Google and its algorithm need to be taken in hand and monitored. Citing financial incentives, Google’s handling of 60%+ of web queries worldwide, and its ability to break small business owners with a shift in ranking, the piece pressed for having the government decide what Google can and can’t change within the algorithm: make the algo public, let the government decide what tweaks can or can’t be made, and determine, in the end, what’s relevant for users.
Needless to say, it wasn’t taken too lightly. Danny Sullivan wrote an entertaining response, using the verbiage from the article nearly word for word but replacing Google with the New York Times. It’s an entertaining article to read, and I suggest taking the time. One of the more enjoyable points for me: he compares, in the end, the New York Times to Google, and at one point even has a side-by-side comparison of the two businesses’ transparency.
Google will list EVERY site that applies for “coverage,” unlike the New York Times, which regularly ignores potential stories
If Google blocks a site for violating its guidelines, it alerts many of them. The New York Times alerts no one
Google provides an entire Google Webmaster Central area with tools and tips to encourage people to show up better in Google; the New York Times offers nothing even remotely similar
Google constantly speaks at search marketing and other events to answer questions about how they list sites and how to improve coverage; I’m pretty sure the New York Times devotes far less effort in this area
Google is constantly giving interviews about its algorithm, along with providing regular videos about its process or blogging about important changes, such as when site speed was introduced as a factor earlier this year.
In June 2007, Google allowed New York Times reporter Saul Hansell into one of its search quality meetings, where some of the core foundations of the algorithms are discussed.
Whose article rings of more truth to you?
Well, under a new patent approved this past week, Google will have an idea of just what you like to point at. The patent, titled “System and method for modulating search relevancy using pointer activity monitoring,” was filed in 2005 and granted this week. It describes a system for monitoring the movements of a user-controlled mouse pointer in a web browser, identifying when the pointer moves into a predefined region and when it moves out of said region.
So basically, you can think of it as something like a hotspot link area for an image, or a div region in CSS. Google can assign an area of its SERPs for analysis and track where a searcher’s mouse moves. What type of information this yields, and how it could be applied in the realm of SEO, is still to be determined. It could, however, give Google a better understanding of how well a SERP composed of blended results (both paid and organic) fares.
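The patent itself doesn’t publish any code, but the core idea — predefined regions plus enter/exit detection — can be sketched roughly like this. The region names and coordinates below are invented for illustration; in a real browser this logic would be driven by a `mousemove` event listener.

```javascript
// Hypothetical predefined regions on a results page (coordinates invented).
const regions = [
  { name: "organic-result-1", x: 0,   y: 100, width: 600, height: 80 },
  { name: "paid-result-1",    x: 620, y: 100, width: 300, height: 80 },
];

// Return the predefined region containing the pointer, or null.
function regionAt(px, py) {
  return regions.find(r =>
    px >= r.x && px < r.x + r.width &&
    py >= r.y && py < r.y + r.height
  ) || null;
}

// Walk a sequence of pointer positions and record enter/exit transitions,
// which is the kind of signal the patent describes monitoring.
function trackMovements(points) {
  const events = [];
  let current = null;
  for (const [px, py] of points) {
    const region = regionAt(px, py);
    if (region !== current) {
      if (current) events.push({ type: "exit", region: current.name });
      if (region) events.push({ type: "enter", region: region.name });
      current = region;
    }
  }
  return events;
}
```

Feeding in a pointer path that crosses from the organic block to the paid block would yield an enter/exit event for each region, which is the sort of raw data that could then be correlated with clicks on blended results.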
And, in the realm of satire, these headers are from a live website whose owners noticed Google had flagged it as a possible spam site. Internet cookies for those who can see what’s wrong. Who said Google doesn’t read meta tags?
<meta name="author" content="" />
<meta name="alexa" content="100"></meta>
<meta name="googlebot" content="noodp" />
<meta name="pagerank™" content="10"></meta>
<meta name="revisit" content="2 days"></meta>
<meta name="revisit-after" content="2 days"></meta>
<meta name="robots" content="all, index, follow"></meta>
<meta name="distribution" content="global" />
<meta name="rating" content="general" />
<meta name="resourse-type" content="documents" />
<meta name="serps" content="1, 2, 3, 10, 11, ATF"></meta>
<meta name="relevance" content="high" />
Google recently announced its “build your own app” program for the everyday person who’d like to customize their Android-powered phone. For free. Apps developed with the platform can be listed in the Android store for a nominal registration fee. Some have said this will lead to an influx of poorly designed apps, and others have argued that it opens people up to a new realm of spam.
Just to add to the mix, Microsoft has decided to toss its hat into the ring as well. Ahead of the expected arrival of the Windows Phone 7 platform, Microsoft has launched its own suite of developer tools.
A brief timeline from the Windows Phone Developer Blog:
Feb 2010 – Windows Phone 7 was unveiled at Mobile World Congress in Barcelona
Mar 2010 – The application platform was unveiled at MIX 10 in Las Vegas. With that, we had the first CTP of the Windows Phone Developer Tools.
Apr 2010 – The tools received an update, and the CTP Refresh shipped.
Jun 2010 – Windows Phone Marketplace details unveiled at TechEd 2010.
July 2010 – Beta release of Windows Phone Developer Tools, and the preview developer phones start shipping to ISVs
The iPhone has its apps, with quality guidelines, a store and whatnot, and an SDK which isn’t terribly difficult to learn, but is made for the technically inclined. Then there’s the newest Android developer software, which allows virtually anyone to create custom apps for their Android-powered phone. And now the Microsoft version, allowing further customization of Windows Phone 7-powered handsets. To add a little cream to their offering, free classes on how to fully utilize the Microsoft software are available. The premise:
It will provide developers a jump start for developing Windows Phone 7 applications.
The dates for these course sessions are:
July 20 – 8am: Session One: Getting Started with Microsoft Windows Phone and Silverlight
July 20 – 1pm: Session Two: Programming Game Applications with XNA
July 22 – 8am: Session Three: Programming Applications with Silverlight
July 22 – 1pm: Session Four: Review and Wrap Up
This is a big milestone for everyone involved in Windows Phone 7 – inside and outside of Microsoft – and we hope you share in our excitement. With the Beta release of the tools, developers can build apps with a “ship it” mentality.
So now it’s turned into much more than just a handset battle; the software, and the apps powered by that software, have entered the fray. With the power to completely customize your cell phone’s functions and uses to cater to your needs, paid app development may be on its way over the horizon. As an additional bonus, the marketing potential for a creative, enterprising small business owner is tremendous.
Being the big dog on the playground, it’s inevitable that you’ll step on some toes. Most recently, Google appears to have stepped on the European Union’s toes.
The European Union’s antitrust chief said Wednesday he is looking “very carefully” at allegations that Google Inc. unfairly demotes rivals’ sites in search results.
Using language such as “the importance of search to a competitive online marketplace,” antitrust chief Joaquin Almunia nonetheless accepted Google’s argument that, given its size and far-reaching strength online, it’s difficult at times to behave in a market as dynamic as the internet. With a storefront active 24/7/365, when a company has worked its way to the top of the game, the little guys can sometimes get knocked about unknowingly.
The inquiry was launched, however, due to Google’s recent acquisition of the travel network ITA, an online booking agency. Two EU-based comparison sites complained to the union that they were ranked lower in the SERPs because they are competitors. And seeing as how higher ranking leads to higher search volumes, the EU may have a case. The algorithms are all programmed, with no human interaction within the SERPs, so in the end, the bottom may fall out of the case.
There are many different ideas in Search Engine Optimization about the influences which affect visibility on the SERPs. One such idea embraces the notion that search engines like fresh content. Trends shift and evolve, at times revolving around a guy who told a guy, who overheard it at a search expo and posted it to his Twitter account. And in the end, the principle behind the idea can get lost.
Adding fresh content can mean different things to different people and firms. The watered-down version you’ll tend to hear from SEO “experts” is that search engines like websites that change their content often, the logic being that changing on-page content will attract bots more often and somehow improve visibility.
To expound a little on the point of fresh content: changing your pages and updating your news and story feeds for visitors is a pro-active measure for retaining that visitor base, and managing, maintaining, and optimizing the links to that information for the bots is a bonus as well.
Simply churning the text of your existing pages on a regular basis, however, is a good example of a best-practice measure of SEO losing its meaning.
Real fresh content comes from adding new pages to your website on a regular basis: adding new pages, optimizing the linking structure to them, and providing well-written, compelling content for visitors and bots alike. This is the definition of fresh content for your and your clients’ websites. The higher the quality of the content, and the more proficient its optimization, the more it will draw bots and visitors alike.
Of the billions of webpages to be found online, and the millions upon millions of pages on search engine optimization alone, always bear in mind the trickle-down effect.
There’s a kaleidoscope of steps, styles, methods and opinions about the right way to implement search engine optimization (SEO) for your site. But, there are a few points which are generally accepted. Points such as:
- Quality content is extremely important
- Working actively to accrue quality links and backlinks is also paramount
- Apply K.I.S.S. to your site
One of the most overlooked steps, which should be mentioned more often, is having an accurate, up-to-date sitemap for your website. You can think of a sitemap as the formal written index of your web pages. Up until recently, multiple sitemaps were needed if you wanted all of your content listed easily: be it images, text, videos, your geo location, or a news section, an individual sitemap for each was required to speed up the indexing of those assets. Google introduced the XML sitemap 5 years ago, and has just recently changed the game a little.
Instead of multiple sitemaps, webmasters can now submit one XML sitemap covering all of their website’s features. From Google:
With the increasing number of specialized formats, we’d like to make it easier for you by supporting Sitemaps that can include multiple content types in the same file.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/grilling/tofu.html</loc>
    <video:video>
      <video:title>Grilling tofu for summer</video:title>
    </video:video>
  </url>
</urlset>
The idea of including multiple content types within one sitemap is to streamline the entire process for webmasters and their clients.
Earlier in the week, Facebook’s own version of SEO – social engine optimization, to turn a phrase – lit up the news world as its version of tackling Google. Seeing, however, that the idea is powered somewhat by users liking a page, it doesn’t seem to hold any cards against the search giant. That doesn’t mean, however, that the idea should be ignored; social optimization is just as important to your business, provided you have the Facebook and Twitter accounts.
The average internet user is already notorious for fast browsing and merely scanning content by nature. Add into the mix, the chaos of social media, and the attention span for the content in front of them drops again.
Creating compelling, relevant, and provoking content is a major key to success in gaining a high number of links, votes, and visits to your content. Style and structure, however, are also major factors in succeeding in social media. We’ll go over just a few basic points in terms of social optimization, to help your pages receive the “Like” that you desire.
Try using shorter sentences – Writing your most relevant, compelling, attractive information in short, informational phrases can be the turning point in keeping a user from clicking that back button. Keeping your key phrases and terms in shorter, easier-to-digest sentences and paragraphs allows the searcher to quickly determine that you meet their requirements on the social web.
Table of Contents – If you’ve shortened your information as much as possible and still have miles upon miles of text, construct a table of contents with anchors within. This allows for quick navigation to interesting sections and provides that extra usability that can be very helpful.
Bullet points and lists – Breaking your more complex portions into bullet points or lists allows for quick and simple reading. Breaking your page down in such a fashion also lends itself to easy linking within the page and site.
Photos and images – Using striking imagery within your pages helps to draw visitors to your page, while your written content is designed to keep them there. “A picture is worth 1000 words,” after all. Just be sure your images are relevant to the content.
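To make the table-of-contents tip a little more concrete, here’s a minimal sketch of generating anchor links from a page’s heading texts. The heading strings and the `toc` class name are invented for illustration; any CMS or template engine could do the equivalent.

```javascript
// Turn a heading's text into an in-page anchor slug.
function slugify(text) {
  return text
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, "") // drop punctuation
    .trim()
    .replace(/\s+/g, "-");        // spaces to hyphens
}

// Build an HTML table of contents: one anchor link per heading.
function buildToc(headings) {
  const items = headings
    .map(h => `  <li><a href="#${slugify(h)}">${h}</a></li>`)
    .join("\n");
  return `<ul class="toc">\n${items}\n</ul>`;
}
```

Each page section would then carry a matching `id` (e.g. `<h2 id="photos-and-images">`), letting readers jump straight to the part they scanned for.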
Social media is here to stay, and it’s best to get used to the idea. Your pages and content need to be attractive, intelligent, and compelling with their first impression. Taking the time to be sure that your social optimization is up to par is well worth the time investment. Building a loyal visitor and fan base in the social media sector of the web, will ensure long term viability in the marketplace.