With Google making a gaffe and releasing their earnings numbers in the middle of the day instead of after the close, it caused a bit of excitement. So much so, in fact, that trading on their stock had to be halted after the earnings came in lower than expected.
The market had already been aggressive with the stock, pricing in positive growth for the company, so when the final numbers came in lower than expected, the stock suffered the knee jerk reaction we saw. But just how is it that one of the most powerful online properties failed to increase earnings after picking up notable acquisitions like Motorola? Perhaps the answer isn't as complex as it seems on the surface.
When it comes to search there are only a handful of viable options for being found online: Google, Bing and the like. But one of the avenues that mostly levels the search playing field is paid search, or PPC. Pay per click is nearly the great equalizer; it's limited only by your daily budget, has no real bearing on the age of your domain, and doesn't rely on heavy backlinking strategies. You just need to write a better ad than the other guy. The issue we've been seeing over the last 8 months or so is the cost per click on client campaigns: costs that previously ran in the 35 to 40 cent range have now climbed to the 3 dollar plus range.
That makes things vastly more difficult for anyone without a budget of several hundred dollars, which quickly adds up to several thousand dollars per month. Short term gains are much harder for the small to mid sized business owner to come by, and who knows, maybe there's a direct correlation with Google's bottom line.
The myths surrounding SEO are many, ranging from what the algorithm contains, to how to trick the engines into ranking you highly with no effort, and everything in between. Just like all rumors, they have a beginning, and it seems someone is trying to start a new one on the Webmaster World forums.
A site owner whose site ended up being 'Penguinized', as he put it, has become overly paranoid about any and all content on his site. He has basically decided that all user generated content is a potential red flag for spam, and as a result has removed or disabled all of it. And so we have the (potential) birth of a new SEO myth: that user generated content, comments, forums, or other ways to directly interact with your customers can lead to Google applying a penalty to your site.
To be perfectly clear: user generated content will not lead to any penalty being levied against your site.
It's topics like this one, started on active forums and blogs, that lead to a great deal of confusion in the search world. It might seem like an innocuous discussion taking place in a proper forum by someone looking for information, but the way the discussion was handled has the potential to leave long lasting repercussions. What often ends up happening is that someone new to the search world finds these posts and begins to believe them, and the myth continues, changes, and unfortunately grows.
And finally, after being patient for the last few months, site owners with a Google webmaster account now have the final say over how links to their site are treated. From the Google webmaster blog:
Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manual spam action based on “unnatural links” pointing to your site, this tool can help you address the issue.
This is going to be a great tool to add to your toolkit if you use webmaster tools directly, and if you don't, you should check that your site manager is keen on what the tool can actually do for you. A quick rundown of how links to your site affect you: you create content, and if it's unique content that is relevant to your niche, users will link to that page. Those links are used as a factor when determining relevance in the results pages for the terms you may wish to rank for, and if you've created great content, the links will follow. The number of links is used as a measure of relevance, so the more the better. There's a downside to links, however, and that's when you have too many 'unnatural' links pointing to your site, say, links from a plumbing site pointing back to your website on shoe sales, two topics that have nothing to do with each other. The only recourse you had as a site owner in that instance was to contact the website that posted the link and ask to have it removed. It was then out of your hands and left for them to deal with, and until they did you could be handed a stiff penalty from the search engines.
The problem with that scenario is that after you've asked the site owner to remove your link, you no longer have control over what happens next. But with the addition of the disavow tool in Google, you can now take matters into your own hands and manage the backlinks coming into your site. It's a great step in cleaning up the web and improving the relevance of the search results overall. You can find out more about the disavow tool here.
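If you're curious what the tool actually consumes, it's a plain text file uploaded through Webmaster Tools, with one URL or domain per line. The domain names below are just placeholders for illustration:

```text
# Lines starting with # are comments
# Contacted the site owner on 10/1/2012, no response

# Disavow a single spammy page
http://spam.example.com/bad-links.html

# Disavow every link from an entire domain
domain:shadyseo.example.com
```

Google has said disavowed links are treated as a strong suggestion rather than an instant fix, so expect the cleanup to take time to be reflected.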
Google has always had the spotlight when it comes to search since it revolutionized the way users access the web. It's grown to a point where, in the last year, they consolidated all of their privacy clauses into one giant blanket policy that affects all of their online properties.
An example of moving forward with search, one I've mentioned a handful of times on the blog, is the DuckDuckGo search engine. Recently the small search company produced a video arguing that Google has each user caught in what they call a search bubble. They took more than 100 users, ensured they were not signed into their Google accounts, had them conduct searches on specific terms, and captured their results.
What they found was that even when the users were not signed into their accounts, and even within the same geographical area, they received differing results pages. It's not a revelatory video really, as Google isn't the only company on the web that uses browser cookies to determine who a user is and what they may like. Not to discredit what DuckDuckGo is hinting at, but with such a small sample, and with users on their personal computers without clearing any session cookies, it's no wonder the results were different for each user. The addition of a control group, say 20 users or so on completely clean installs of a browser and OS, would have helped balance their results.
The numbers for the past month in search came out, and while Google remains on top with the majority share, what was somewhat disheartening was the continuing slide of Yahoo's position.
There's been no real shift in the overall numbers: Google is still sitting at just over 2/3 of the search share, and Bing follows with about a 15% share. Yahoo slipped even further than in previous months, to around 12%, giving the combined Bing/Yahoo alliance just shy of 30% of the market. Yahoo was one of the primary search engines and one of the first to roll out a paid search marketing platform, so it's sad to see them slip further out of the limelight of relevancy. Change, however, is inevitable, and usually a good thing for all parties involved.
The search share numbers aren't terribly surprising in the grand scheme of things, and perhaps it was the additions of Panda and Penguin to the equation, but the number crunchers are at it again. On the webmaster forums there is discussion going on about what the current algorithm may contain and how it might use analytics to help rank the sites in the index.
Some interesting theories are coming out of the discussion, mainly because no one outside of Google really knows the process for ranking the sites within the index. Google has said previously that they don't use any search data from their Chrome browser, and the running theory so far is that the search giant is using click data from ISPs. In the end, only the team at Google really knows how the engine ranks its results.
There's been a small surge of malware reports coming from searches on Bing and Google, which really isn't news in and of itself, as malicious results are always buried somewhere in the listings. What is different is that on some terms, more than 90% of the image results were found to be malware related.
The most targeted term this time around happens to be "Emma Watson", whom McAfee has named the most dangerous celebrity search of 2012. Of the two engines, 30% of Google's searches had malware warnings attached, and more than double that came out of Bing's results. Malware takeovers happen in a couple of different ways: one of the most frequent is websites built with little to no security in mind, and then there are throwaway websites and URLs used purely to spread malware across the web.
Black hat SEOs typically go after the hottest search terms and poke around the web looking for websites with loopholes in their security. They actively work to hijack the website and its URL to lend false authority to whatever term they want to spam. And because uneducated or hasty users tend to automatically trust the top results in the search engines, the spread of malware will continue.
Because of the recent discovery that image results are getting slammed with malicious entries while the text results pages are being left behind, Bing has (currently) been unofficially dubbed the most poisonous search engine. The moniker stems purely from the recent report showing that malicious websites are now targeting image searches rather than the text results pages. Not to fret, however: Bing and Google will take steps to close the holes that have opened in the image results, and in the meantime just be a little more cautious before clicking that top image of your favorite star.
There's another minor change coming shortly to Google Adwords campaigns. It's not exactly something new, but it's a change that will affect those who are less diligent with their campaigns.
Google has always been a stickler for content; no matter how you feel about the company or the way it conducts its search business, that's always been the case. With the trillions of web pages out there, the job of finding the most relevant results for each search can be difficult, but they do their best. The advantage they have with the Adwords platform, however, is that they're able to monitor whether you're actually following the rules they laid down. One of the bigger rules the search engine has finally decided to get tough on is just where your ads point to. From the Adwords blog:
Our existing policy has required each sitelink in a campaign to link to a different landing page. That means a user would have a meaningfully different experience on the landing page from each sitelink.
The reason that the company is going to start laying the smack down? They’ve done some checking and noticed that multiple ads are all heading back to the same page, a no-no in their guidelines.
What happens moving forward is basic: initially the Adwords team will focus primarily on newly created ads and campaigns, ensuring that the landing pages are unique. If your sitelink matches an existing page, only one of the two will be listed; the other will be restricted from showing up. The idea behind the enforcement is obvious: they want to give everyone the chance to appear in the results, and the best way to do that is to make sure everyone is following the rules they've set out. Now don't fret and exclaim that you need to tear down all of your campaigns and rebuild them from scratch. The folks on the Adwords team realize that cleaning up your sitelinks isn't a small effort, so enforcement for existing links will be a few months out; at first, this will affect only new sitelinks.
Canada is still really coming into its own online, and perhaps Google has come up with the right idea to encourage some growth. The Google eTown award is designed to recognize towns where small businesses are investing in online tools and resources to find new customers, grow their business, and improve their operations.
"We're delighted to recognize the accomplishments of the small businesses embracing the web in each of these cities," said Chris O'Neill, Managing Director of Google Canada. "We know that the Internet is going to contribute massively to Canada's economic growth."
We've been saying for a few years here at Fresh that online business is still coming into its own for all us Canucks; hopefully this new incentive will help spur things along. They're not playing any favorites, and have divided the country into 5 zones: Atlantic, Quebec, Ontario, Prairies, and BC & North. This year the 5 winners are: Moncton, New Brunswick; Dorval, Québec; Parry Sound, Ontario; Canmore, Alberta; and Duncan, British Columbia.
Across each of the winning towns, businesses have discovered that the Internet has become a vital tool for reaching new customers and engaging with existing ones. In recognition of their eTown status, Google Canada will present the awards to each city during events throughout October. Local businesses will be invited to attend and meet Google experts, who will be on hand to provide advice on growing your business online. Actively working on your brand image is one of the key initiatives you need to pursue while growing online. Being able to translate your offline business into your online image is one of the goals you should be aiming for, and when you're ready for that step, Freshtraffic is here to help.
There have been a couple of Google updates in the last week which have made a visible change to the results pages. One of the updates was fairly simple: a Panda update which, according to the available information, impacted less than 1% of searches. The other update was in how the results page returns your search results.
The Panda update was another in a long string of updates designed to improve overall search quality, and with the change affecting such a small sample, if you're following the rules you shouldn't perceive any issue. If you're the numbers type of person, you can compare the less than 1% affected by this latest update to the 12% affected when Panda made its initial breakout onto the web. One point to bear in mind, with this news coming straight from Google, is that there may be some surprises in store under the hood of this latest update.
Maybe the extra effort they put into the Panda update has something to do with the tweet from Matt Cutts earlier this week?
Just fyi, we rolled out a small algo change this week that improves the diversity of search results in terms of different domains returned.
Diversity in the search results is a great growing point, as it gives more opportunity for optimization to show its mettle. It's possible to have an entire first page of results completely dominated by a single website, if your optimization is that strong and your competitors are that weak. While the change was added only this week, reports are that it hasn't yet been noticed in the bulk of the results.
And it’s a bit of a throwback to the beginnings of the web, but according to a recent posting, the keywords tag has returned. According to Google News Product Manager Rudy Galfi:
The goal is simple: empower news writers to express their stories freely while helping Google News to properly understand and classify that content so that it’s discoverable by our wide audience of users.
Similar in spirit to the plain keywords metatag, the news_keywords metatag lets publishers specify a collection of terms that apply to a news article. These words don’t need to appear anywhere within the headline or body text.
A small difference in the way the news_keywords tag works is that you're limited to 10 terms, likely to stop users from cramming in as many terms as they like. It's a new way to use an old tag, and it's going to be interesting to see how its use plays out.
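For those who want to try it, the tag sits in the head of the article page just like the classic keywords metatag; the terms below are purely illustrative:

```html
<!-- news_keywords: a comma-separated list, up to 10 terms; values here are examples only -->
<meta name="news_keywords" content="search engines, Google News, metatags, publishing, SEO">
```

Unlike the headline, the terms don't need to appear anywhere in the article body, which is exactly what makes the tag useful for publishers.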
It's long been clear that a search engine is a search engine, and there are a handful online that receive the majority of the traffic out there. Google, Bing and Yahoo are the usual ways people search, with other sites like Blekko, DuckDuckGo, and Ask also being used by those desiring a different experience. Facebook, however, is one of the largest websites online, and with its user base approaching a billion, the world has been waiting to see if Facebook is going to try to enter the search arena.
Ever since Facebook went public, the stock has been a sort of tepid pool, and with no real revenue model the online mutterings often turn to the topic of a search engine. And when Mark Zuckerberg throws around statements akin to "Facebook is doing a billion searches per day without trying", the mutterings pick up some volume. In a recent Techcrunch interview, Zuck made it clear that the company realizes there is a huge opportunity for a search engine with Facebook, but tempered that by also talking about how the way users search is ever evolving.
"Search engines are evolving" to "giving you a set of answers," Zuckerberg said. "Facebook is pretty uniquely positioned to answer a lot of questions people have. At some point we'll do it," he went on. "We have a team working on search."
There are some real concerns about just how Facebook could leverage their massive user base as a search engine, however, and it has less to do with spidering capabilities than with privacy. The system as he briefly described and envisioned it would take the opinions of your friends, family and contacts and try to form a result for your query. Searching for terms like 'best burger' or 'new batman movie reviews' wouldn't necessarily be informational, but would deliver a list of opinions from your contact list. At any rate, it's not happening today or tomorrow, or even soon for that matter. But when (if) it does, it will introduce change into the search landscape and the online experience as a whole, and change is very, very good.