So unless you’ve been living under a rock, you know 2012 is set to be an immense year in the mobile search industry. Talks at the recent Mobile World Congress shed some light on mobile usage in Europe, the US and Japan.
And out of a survey of 1,000 respondents, the answers are still quite surprising. First, smartphone search has reached nearly 100% penetration in the market, and most of those owners search at least once a week. According to StatCounter, Google accounts for 97% of browser-based mobile search. If that isn’t enough of a spur to work on your mobile site, consider the social side of the mobile web. In the US in particular, over 90% of smartphone owners sought local information in their searches, and smart-device owners were heavy social networking users. Of those local searchers, 25% made a purchase based on their findings, and more than half contacted the business they found.
That’s only the US numbers, and already it’s easy to see that the figures are climbing quickly. Some of the other discoveries can be found here, but I’ve pasted a few of the highlights below.
Half of mobile shoppers make a purchase on their device, and 20% of those (US) make a purchase daily.
More than a third of consumers admit to carrying a smartphone in order to compare prices while they shop.
More than 1 billion people (globally) will use mobile devices as their primary internet access point.
That’s only some of the data reflected by the study. Are you and your website ready for the mobile web?
Have you noticed any shifts over the last couple of days in your search results? As a site owner or an SEO for a client, have you noticed any changes as of late? You wouldn’t be alone in taking note, and you would be correct. It has recently been confirmed that Panda has been unleashed on the web again, making it even more accurate and more sensitive to changes online.
Some site owners are noting huge gains in their organic results, perhaps because they’ve attended to the issues that cropped up when Panda first passed over their site and erroneously booted them. On the other hand, some sites were hit harder by this update than by any previous one, and continue to flounder in the search pool. For better or worse, Google has also said that the update hasn’t finished rolling out yet, and probably won’t until tomorrow. There are still high numbers being reported on forums of sites being dumped in the results by Panda, but if you’ve been on your game and following the good practices guide, you should be sitting just fine.
For all of the updates that are done to the various search engines, and all of the tweaks they make to their algorithms, one simple truth remains: stick to the basics and it’ll work. It may take longer than trying to work out every single step of the algo, but so long as you concern yourself with sticking to the best practices guides provided by the search engines, your site will list. And it will continue to list, so long as you haven’t done anything naughty to get yourself kicked out of the SERPs.
So I guess it stands to reason that Google would jump at the new privacy bill being pushed through the White House. Regarding the new online privacy legislation, Google decided to get behind “Do Not Track,” technology that lets users opt out of tracking by websites and online advertisers. So what exactly did Google just agree to do? It will add support for Do Not Track to its Chrome browser. The way the technology works is fairly basic: the browser sends an HTTP header with each request, telling the website and its advertisers not to track that visitor. In browsers that already support Do Not Track, a user only has to set a single option. In Firefox, that’s done through the Options (on Windows) or Preferences (Mac) pane by checking a box marked “Tell web sites I do not want to be tracked.”
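To make the mechanics concrete: the header in question is simply `DNT: 1`. As a minimal sketch (server-side, and not any particular site's actual implementation), a web application could honour the signal like this:

```python
def tracking_allowed(headers):
    """Return False if the visitor's browser sent Do Not Track (DNT: 1).

    `headers` is a plain dict of request headers here for illustration;
    a real framework (Flask, Django, etc.) exposes the same value via
    its own request object.
    """
    # HTTP header names are case-insensitive, so normalize before checking.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") != "1"

# A browser with Do Not Track enabled sends "DNT: 1" on every request.
print(tracking_allowed({"DNT": "1"}))              # -> False
print(tracking_allowed({"User-Agent": "Mozilla"})) # header absent -> True
```

The catch, of course, is that the header is purely advisory: it's up to each website and ad network to check for it and act accordingly.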
Google has made its missteps in the past few months, and there has been a great deal of discussion on the web about the bigger topics. The main points that typically crop up are privacy and security, especially with the latest Safari/cookie gaffe. The fact that there isn’t really true privacy online notwithstanding, the web isn’t a sandbox, and there are options with which to keep some of your interests your own online.
Scroogle used to be a top destination for users who wished to conduct private searches, basically providing a proxy for your searches so that your search history would at least remain in your control. Recently, though, it was the target of DDoS attacks and throttled by Google, and the plug has been pulled. The most recent player in the privacy game is DuckDuckGo, which made its debut by comparing how Google tracks your searches while DuckDuckGo doesn’t. There are other search alternatives out there as well, which encrypt your searches, don’t create cookies on your computer, or don’t store your search information on their servers. Google even provides its own version of encrypted search when you visit the HTTPS version of its site.
When Google was taken under the leadership of Larry Page, he quickly and decisively pointed the sights of the Google machine at the social target. And in the last few months there have been a few hiccups with Google’s changes in policy.
So it’s no secret that Bing and Google aren’t the best of friends, but with Microsoft behind the Bing machine, it was a shock for the web to suddenly find Google labelled as malware.
You may think it’s really not that big of a deal, but it only takes one red flag to turn many novice users away from using any service or website. The mistake has since been ironed out on Microsoft’s end, and Google is no longer labelled as a security risk. Malware is a rather generic term, basically covering any kind of code or software which either steals your private information or messes up your computer enough that you can’t really use it effectively. Unfortunately for those same novice searchers and computer users, malware has another, more inconvenient side.
It should be no surprise that the scripters and hackers who develop malware are also tied to the black hat side of the SEO world. Search is a multi-billion dollar a year industry, and being able to sit atop the search results for highly competitive terms, even for a few days, can be worth millions. This is often where you’ll find a specific type of malware known as ransom software. When a user clicks on the address of what they innocently think is their top result, they’re instead greeted with a popup message along the lines of “Your computer is infected – click here to protect your data!” Once that user clicks the button, they’ve been hooked, and once that back door has been opened, it is notoriously difficult to shut. It often leaves you open to further backdoor access, which the scripter can use to steal your information, or even to use your own computer to attack other unsuspecting searchers.
The first step to defending yourself is to have a proper anti-virus product; even a basic one will stop the majority of malware. The second step is to know what you’re seeing when you search. A proper website URL will look like www.this-is-a-real-site.com/yourresults.html, shown in green below your search result. A strong indication of a hijacked site or a possible malware trap is an address that looks like this: www.possibly-malware.com/?p=23466. If you find an address which begins with a query string, there’s a good chance you won’t end up where you’d hoped.
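The rule of thumb above can be expressed as a tiny heuristic. This is a rough sketch of the article's eyeball test, nothing more (the function name is my own, and plenty of legitimate sites also use bare query-string URLs), so treat it as illustration rather than a detector:

```python
from urllib.parse import urlparse

def looks_suspicious(url):
    """Flag a result URL whose address is just a bare query string
    right after the domain (e.g. /?p=23466), with no real page path.

    Heuristic only: it mirrors the eyeball test described in the text
    and is no substitute for a proper anti-virus product.
    """
    parsed = urlparse(url)
    # No meaningful path, but a query string immediately after the domain.
    return parsed.path in ("", "/") and bool(parsed.query)

print(looks_suspicious("http://www.this-is-a-real-site.com/yourresults.html"))  # False
print(looks_suspicious("http://www.possibly-malware.com/?p=23466"))             # True
```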
Money is a great thing; it’s needed for pretty much everything you need or want in this world. There are times to save money, and there are times to spend it. With the new year still fresh, now is the time to spend on your online presence so you can make 2012 your best earning year to date.
Search engine optimization is, for some odd reason, still a largely overlooked advertising expenditure. The internet is the ultimate storefront: it never rests, and it is always waiting to bring customers to your doors. It takes time, patience, an understanding of your current website and traffic, and a clear idea of your ultimate goals to even begin to craft an SEO campaign.
There’s no ‘one size fits all’ version of optimization, as each and every client and website has its own unique set of problems. When you’re in the market for SEO, you need to bear that point in mind. If you’re searching for someone truly qualified in the area, there’s a very high chance they won’t have pre-packaged services for you to choose from. There are really two main steps when you’re hammering out the costs of search engine optimization. The first, which directly affects your cost, is deciding what you’re trying to achieve and which key terms you’re interested in. If you’re looking to rule the SERPs on a term which returns tens of millions of pages, your contract will cost more than one for a niche market. The second step is where the compromising comes into play where terms are concerned.
Working as SEOs, we see the web a bit differently than other people do. I know I haven’t browsed or used the internet the same way since I began. Sometimes the key terms clients choose need some adjustment, and through discussion we decide which route to pursue. It can mean the difference between a page 1 ranking or not, or a few thousand dollars in the terms of a contract. Our goal, in the end, is to bring you all the traffic you can convert. Are you ready for the 2012 rush?
Once you’ve built your website, there are many key elements you need to stay on top of besides search engine optimization: updating content, perhaps running a blog or a Twitter account with which to interact with customers and clients, and, if your niche demands it, publishing a newsletter or email campaign to keep in touch with subscribers.
In the background of your website, there’s another element which needs consistent attention. Every time you add a page to your website, create a new form, or add a photo album, your sitemap needs attention too. A sitemap is exactly what the name implies: a table of contents for every page on your site. If you add, change, or remove pages, you also need to update your sitemap to reflect the changes. Typically your sitemap will be in XML or HTML format, but the important point is that it needs to be updated every time you make a change.
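For the XML variety, the format itself is very simple: an `<urlset>` element containing one `<url>`/`<loc>` pair per page, per the sitemaps.org protocol. As a minimal sketch of regenerating the file whenever your page list changes (the helper name and example URLs are hypothetical; in practice a CMS plugin or crawler usually does this for you):

```python
import xml.etree.ElementTree as ET

def build_sitemap(page_urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list
    of page URLs. Re-run this whenever pages are added, changed, or
    removed, so the sitemap always matches the site."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in page_urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical page list for illustration.
sitemap_xml = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/photos.html",
])
print(sitemap_xml)
```

The full protocol also allows optional `<lastmod>`, `<changefreq>`, and `<priority>` children per URL, which give the search engines hints about how often to re-crawl each page.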
Until now, there’s never really been a way to validate your sitemap without waiting for the search engines to pass by your site and index it. At that point you could sign into your webmaster tools or site analytics and verify that you’d either done things right or needed to make some adjustments. Now, for those who may be a little less technically inclined, Google has added the ability to test your sitemap before the spiders get to it, to make sure everything is done correctly. This update, along with a handful of other new and upcoming changes to their site tools, is detailed on their blog post.
Over the last couple of weeks people have been hacking and slashing at Google because they’ve rolled out a change to how your results pages show up when you conduct a search. They’ve dubbed the change “Search plus Your World,” and the idea is that you receive Google+ data in your results when you’re signed into your Google account and conduct a search. Personally, I really don’t see the issue with their idea, and here’s why.
The number one reason: if you’re signed into your Google account and searching Google.com, why would it surprise you to find publicly available information from Google+ in your results pages, if it’s relevant? And judging from all of the screenshots of the integrated social results, a click of a button and they’re gone. Another argument I’ve seen about Google integrating this information into the SERPs is that it prioritizes Google’s own content instead of linking out to third-party sites, which is arguably the whole point of a search engine. It’s a valid point to bring up, but again, you can simply shut the option off with a few clicks at most. In an online world where 800 million or so people are used to the “opt-out” model thanks to Facebook, it’s almost surprising that it’s taken this long for another major web player to try it. Twitter and Facebook even backed a small browser bookmarklet of sorts to help cull the Google+ results from your results pages. It has outraged enough people that bloggers are already forecasting that Bing is the new King of Search.
It’s perhaps those last two points which contributed to my puzzlement. For all of the people up in arms with Google and switching over to Bing, I can only assume two things: you were born on January 1, 2012, and you don’t have a Facebook account; amazing, really, considering how many there are. Here’s a brief excerpt from an article stabbing at the changes Google has recently made:
The new feature is baked right into Google and aims to personalize your search results by including Google+ data when you are signed into your Google account.
And here is an excerpt from an article written in May 2011:
The worlds of SEO and social media were rocked the other day when Bing announced they will incorporate Facebook data into their search results for the most personal social-search integration to hit the web. What does this mean for the user? If you search for something on Bing and are logged into your Facebook account, you will see which pages, products and websites your friends Like and recommend high in the results, regardless of where that page ranks in the general SERP.
Perhaps Facebook should recite the idiom that people in glass houses shouldn’t throw stones, as Bing and Facebook have been at social search integration for coming up on a year now.