The US Thanksgiving has come and gone, and with it Black Friday, the occasion when everyone tries to find the best deal. But the limelight is slowly shifting to the newcomer on the shopping scene, coined Cyber Monday in 2005. We wrote about the date back in early August because, with the way the internet and the search engines work, that lead time would have let you put yourself in a commanding position. Did you take advantage of the forewarning? Or did you settle for where you are and lean on your in-store sales? If you did the latter, you’ll likely soon be kicking yourself as the predictions and the numbers start to come in.
The term Cyber Monday was first used in 2005, after online spending on the Monday following Thanksgiving suddenly jumped. Since then, the day has grown into such a huge business that some stores report nearly 40% of their yearly income comes from this single online shopping day, and that number will only continue to grow. This year it’s estimated that Americans will spend somewhere in the neighborhood of $1.4 billion in 24 hours on this one day alone, up a good 17% from last year’s online spending. With more and more people owning connected devices, from phones and laptops to iPads, the lure of shopping online is growing rapidly. The most telling quote from comScore would have to be the following:
Of all the benchmark spending days, Thanksgiving is growing at the fastest rate, up 128 percent over the last five years
That’s a huge portion of income you could be missing out on simply by not taking advantage of the online branding advice we hand out freely here on our blog. The 17% year-over-year growth figure needs to be taken with a grain of salt, of course, as not every industry can expect the same lift, but as a business owner you need to take stock. What could you achieve with a better online position? What improvements could you make with a 5% increase in income? What about 8%, or 10% for that matter? When you’re ready to find out, contact us here at Fresh and we’ll help you answer those questions.
Search is a finicky thing on its own, let alone once you start factoring in all the (seemingly) random variables that shape the results pages. Both Bing and Google have their own sets of checks and balances which they use to build a results page from your search terms. As varied as the internet is, there are metrics that both algorithms share, and it’s the differing ones that make each engine’s results unique in their own way. These algorithms have developed and grown over time, as has the search market and the way it functions as a whole.
The search market started out in a very basic way: you typed in the terms you wanted information on, and the engine looked through the index its spiders had built and tried to return the results that best matched your request. Your query was matched against the exact terms and anything close to them; search began as a relative function. As the definition goes, relative means in relation to, so if you searched the term ‘red rose’ as an example, you’d not only get images and descriptions of the flower, you’d also likely end up with pages about the baking flour. Both items are relative to the search term you entered, so it made sense for a bot to show you both, as it couldn’t discern which you were actually looking for.
Now the web has grown up a lot; it’s started to mature and has developed some almost scary tricks. The term that’s been thrown around especially often in the last six months is semantic search. The simplest way to describe it is with that same example from relative search, ‘red rose’: as the web and search evolve, the bot acts intelligently. It’s being seen more and more often in Google, Bing and the social networks out there; because you’re an avid baker, the bots would likely serve you results pages populated with the baking flour and associated websites with recipes on them. They won’t bet the farm on you wanting the baking results, so you’d also receive some of the flower on your page, but it’s a best-guess situation.
Semantic search, and likely presumptive search, is the way we’re heading. Soon you won’t really have to search for an item or a website at all; the bots, or whatever technology is running things by then, will know what you’re looking for within the first few terms you type. It might seem scary, it may even seem intrusive given the way the world works and how people think. The simple truth, however, is that this is where the web and search are going. It also means that, from my point of view, the job of online branding will become vastly more important than it is today.
When we build a website for a client, whether they’re in Winnipeg or anywhere else in the world, we make sure that forward-thinking marketing is covered. And since we’re in the business of online branding and internet marketing, we make sure each website we develop has the capability to become a leader in its niche, so long as the client decides they want to. We didn’t settle on some arbitrary checklist; there are some very specific points we look for. We’ll go over a handful of them this time around, and if you’re involved in the industry in any way, you’ll probably recognize some of these traits in the platform you use.
One of the very first necessities is being able to customize the page title and meta tags of each page. If your website is properly built and you’re following the best practice guides that both Google and Bing make readily available, you should already know that having identical titles or tags on all of your pages is a big no-no. At the very least you should be able to customize each page’s title, meta data and header tags; if you can’t manage these very basic snippets of information on your site, you’ve started off on the wrong foot and we haven’t even gotten to the hard stuff yet!
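To make this concrete, here is a minimal sketch of what unique per-page metadata looks like in a page’s markup; the florist shop and all of the wording here are hypothetical:

```html
<!-- Page: /red-roses -- the title and description belong to this page only -->
<head>
  <title>Red Roses | Fresh-Cut Bouquets | Example Florist</title>
  <meta name="description" content="Order fresh-cut red roses, arranged and delivered same day.">
</head>
```

Every other page on the site (the tulips page, the contact page) would carry its own distinct title and description rather than repeating these.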
I’ve touched on this point several times, but when you’re building your site you need to think about the navigation menu. And I’m not referring to creating a singing-and-dancing menu that thanks a visitor for being part of the website experience; I’m looking more at a navigation menu that uses CSS to control the display elements. You can have an impressively interactive navigation menu using CSS alone, and because the links are plain text, they’re easily indexed by all of the search engines and are much more responsive than a Java or Flash equivalent. That indexability cuts both ways, though: since the search engines read your menu text directly, a spelling mistake in a menu item can end up indexed if you’re not paying attention.
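As a rough illustration of the idea, a plain HTML list styled with a few CSS rules gives you an interactive menu whose link text the crawlers can read directly; the page names here are made up:

```html
<style>
  nav ul { list-style: none; margin: 0; padding: 0; }
  nav li { display: inline-block; }
  nav a  { display: block; padding: 8px 14px; text-decoration: none; }
  nav a:hover { background: #eee; } /* hover interactivity, no Flash or Java required */
</style>
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/services">Services</a></li>
    <li><a href="/contact">Contact</a></li>
  </ul>
</nav>
```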
It’s an often overlooked feature, as a typical site owner doesn’t usually think about the website link beyond the main address, but being able to control how your URLs are created is a major point where best website development practices are concerned. If you’ve ever copied and pasted a link from a major online shopping site like eBay into an email, you’ll have noticed the link contains a mess of letters and numbers (=item20cdb2380c&_uhb=1#ht_599wt_1139). These letters and numbers aren’t there for users, they’re definitely SEO-unfriendly, and they should be avoided wherever possible.
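One common way to get readable URLs is to build them from the page title. Here is a sketch in Python, purely as an illustration; the function name and examples are our own, not part of any particular CMS:

```python
import re

def slugify(title):
    """Reduce a page title to a short, readable URL segment."""
    slug = title.lower()
    # Collapse every run of spaces, punctuation or symbols into a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("Red Roses & Baking Flour!"))  # red-roses-baking-flour
```

A page published at `/red-roses-baking-flour` tells both the visitor and the search engine what it is about before either has even loaded it, which the string of session IDs above never could.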
These are only a couple of the very basic best practices that you’ll find discussed in any of the website development guides out there. If you’ve got the time, you should work your way through your site and if you have it, your CMS backend and ensure that you have all of the above listed functionality. If you’ve learned that you don’t have these capabilities, get in touch with us here at Freshtraffic as soon as possible and we’ll get that taken care of for you. The longer you wait on necessary changes like the above, the deeper you could be lost in the results pages.
It’s no secret that Google is the big kahuna where search is concerned, and they make enough money year after year that they could have their own printing press. But for the last year or so especially, Google has been the target of antitrust and privacy complaints across the globe, with advocates pushing for more from the search giant: claims that it takes too long to clean your past out of the search engine, and blame directed at the provider for results deemed inappropriate.
The web is, at its core, a giant repository of everything: pictures, videos, text, scripts, code, and trillions upon trillions of 1s and 0s that make up websites and documents. It’s often a strange sensation to go back to an old website you used to frequent, read some of your past ramblings and wonder, what was wrong with me, or, why would I write something like that? With the way the internet holds onto its history, you can often find information about anything, or anyone for that matter. You would be hard pressed to think up a legitimate search topic that wouldn’t appear on a search engine somewhere, and it’s highly likely that Google has it indexed and stored in one of its multitude of data centers across the globe.
It’s that level of access to information that seems to have raised the hackles of some of the population, and has them calling for regulations on search engines. Soon it won’t be just Google caught up in these privacy and antitrust regulation talks; Google is being made an example of because they’re the biggest target out there, and so, who better to hit. The plain and simple point of contention, access to information, isn’t a search problem; I’d blame it more on a generational divide. The youngest users of the web, those 13-18 year olds, have grown up with 24/7 access to the web and all of its content, while the top end of the user range, the 65+ crowd, sees the internet in a completely different way.
Forty years ago, when a family went on vacation and took snapshots, they didn’t share them with 400 of their friends on a social network. It was maybe the 6-10 close family friends they shared their details with, so they could control their information and keep a semblance of privacy. Flash forward to the same family now: little sister is posting pictures to Instagram and Facebook, while the 17-year-old son is streaming a Netflix movie. Mom and dad are using a GPS navigation system with turn-by-turn directions, and are setting up a video chat with the friends they’re on their way to visit. Every time that photo is viewed on Facebook or Instagram, it’s being saved with another web address, in another location. Every time you’ve used Skype or your iPhone for a video call, the connection and its duration have been saved on a data server, and every movie or show you stream online has helped define your likes and dislikes for the service, so it can offer you a better-targeted product later. It used to be called personal accountability: if you didn’t want to be viewed a certain way, you just didn’t act that way, and conducting yourself well has only become more important.
Privacy hasn’t disappeared, but it’s definitely not the same as it was 40 years ago, as a person living in the digital age you need to be acutely aware of your online conduct. Because everything you say, do, or post is saved somewhere. Google, Bing, Yahoo, and all of the other search engines just search for information. They do not operate with bias or under the control of some megalomaniac with a god complex who is out to control the world. All they do is take a mess of 1s and 0s, and display them in a way that a person can understand them. And just remember that the information that people are trying so hard to push Google to bury, erase and hide, can be found just as quickly on the other major search engines out there.
Does search engine optimization need to go the way of the dinosaur? If you follow any of the reporting outlets out there, a couple of times a year they bring out the funeral procession for the SEO industry, yet since it became the de facto method of gaining visitors it hasn’t budged. But is it really, finally, time to bury it?
Before I get too far ahead of myself, it needs to be said that the search industry isn’t going anywhere anytime soon. With more mobile devices connected to the web than there are people on the planet, the method of delivering results will continue shifting toward that marketplace, but it will never disappear. A better way to pose the question might be: is it time to lay the term SEO on its deathbed? Search engine optimization began as the goody-two-shoes brother to the black-hat results page spamming that plagued the internet in the early days, and black hat still plays its own part in the search world. Black hatters, for all their dastardly intentions, actually play a vital role in the search market: if they didn’t exploit things and find ways around the algorithms, the engines could never improve the results they provide.
But that aside, with the prevalence of information on the web about best practices (Bing recently came out with their own guide, and it reads much like Google’s), and with blogs, forums and podcasts covering the methods of working with your site and its content to rank on the web, it seems these days everyone is trying to be an SEO expert. And with this happening, the name and its methods have become muddied, with conflicting steps and advice, and some self-professed gurus who outright skip the basics.
It’s a difficult decision, coming to the conclusion of burying the term search engine optimization, and then deciding what to use in its stead: online marketer, online branding consultant, perhaps internet consultant. One simple problem remains, though: no matter what moniker gets attached to the industry, everyone who followed suit into the SEO realm will eventually follow along to the new buzz term. Maybe with that in mind, it’s just as well to let SEO live for another day, for now.
Online marketing and branding can be an intensely competitive market, made even more difficult by there being billions and billions of web pages out there about everything you can imagine. And while they say imitation is the sincerest form of flattery, it can tend to be a death note where the search engines are concerned.
With the web being so massive, it can often be difficult to say where content originated. Images get copied, text gets scraped and snippets of code get replicated across untold numbers of websites. Where organic optimization is concerned, proving original authorship is a time-intensive process in some cases, and even then it may not make a ton of difference. It’s a different story, however, where paid advertisements are concerned, such as with Adwords campaigns.
Adwords is a much different platform from organic search, the biggest difference being that you’re paying for your positioning in the results pages. You bid on your chosen keywords, and if your ad copy and your bid are better than your competitors’, your ad will appear, frequently before theirs. It’s a lucrative search market, namely because it’s where people make their snap buying decisions. Sometimes there are companies out there which play a little dirtier than others, copying ad copy directly, or even copying ad titles and format. It’s a dirty business practice, comparable to Pepsi mimicking a Coca-Cola commercial or jingle.
As dirty as it is to copy a competitor’s titles, copy or entire ad text, due to the nature of the business they may be allowed to run the ad, unless of course you dispute their usage. A prevalent argument in these cases falls under the Adwords informational site policy, a long-winded document that covers the use of trademarked terms in Adwords. It basically limits the use of a trademarked term to the original mark holder or a reseller of the product. The loophole appears in the portion on informational sites, which can carry the trademarked text if the landing page of the ad is informative in nature with respect to the written ad text. Just because the loophole exists, though, doesn’t mean you’re out of luck if your competitor runs an identical ad using your text; your first step should be to file a dispute against the ad in your Adwords account. You’re also covered by the same trademark policy where it says, in essence, that you can’t use a trademarked term if the goal is to take sales away from the trademark holder.
Make sure to be diligent with your Adwords copy, and if you see someone using your very own text to try and snag away sales, report them as soon as possible. If you let it slide, there’s nothing stopping them from taking your next big sale.
Recently Google turned on a tool which enables website owners to disavow selected backlinks coming to their site. It’s a great tool that allows a diligent site owner to take control over who links to them. The process is fairly basic: you create a text file listing the backlinks you’d like to have disavowed, upload it through your Webmaster Tools account, and voila, supposedly case closed.
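The text file itself is nothing fancy. A minimal sketch might look like the example below, where the domains are entirely hypothetical and lines starting with # are comments:

```text
# Links we already asked the site owners to remove, without success
domain:spammy-link-network.example.com
http://low-quality-directory.example.net/our-listing.html
```

A `domain:` line asks Google to disavow every link from that domain, while a bare URL covers only links from that single page.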
It seems, however, that some people aren’t content with the way the system works. One webmaster, after submitting his disavow list and then resubmitting a reconsideration request, was greeted with the advisory that there were still some bad links pointing to his site. The timeline he’s unhappy with has now stretched to a month since the initial submission of his disavowed links. There are a couple of theories about why there are still problems, but there are also a handful of points that all webmasters who use the disavow link tool need to bear in mind.
The primary point to remember when using the disavow link tool is that it is not an instant or quick fix for any and all backlinks you might want removed. Google has data centers all over the globe, and with that it holds a number of different versions of your website at any given time. As odd as it may sound, it’s like the versioning you’d use when working a project through various stages of completion: just as you could look at version 2 of your project and see where you were at that stage, each data center will have a slightly different version of your site and its backlinks. It takes time for any kind of clean-up request to propagate through the entire system.
The second major point that needs minding is that just because you’ve submitted the disavow list, and/or asked for the offending backlinks to be removed, it doesn’t mean it’ll happen quickly (as per the first point) or at all. The tool functions much like asking another site owner to remove a link to your site: it’s a request, and if it is honored, you have no control over how quickly that happens.
When I arrived in Canada in August 2007 I got to speak with three print media giants, each big in their own area: The Yellow Pages, the Winnipeg Free Press and the Winnipeg Sun, in that order.
I told each and every one of them that they needed to change their outlook and the way they operate to make dollars and survive going forward; all of them thought I was some cocky nut from Blighty.
Funny how things work out. The YP owned AutoTrader at the time, and took what I told them about needing to change direction in how they advertise and work online as a slap in the face; they later sold it to the Brits, losing $500 million. Full Story Here
Then I spoke with the Free Press on a couple of occasions, thinking I might get more traction in my adopted city. They had a circulation of around 500,000 at the time, if we believe the stats, and again they thought: this guy is nuts, all he does is go on about online, Google and social media; we’re starting our own stuff on our website, selling cars, real estate and banner ads; we know what we are doing. A few years later they told me they were doing their own app, and that it was going to be a killer. I asked: for whom? I learned last week that they’re changing course again, as all this has not panned out as expected; they were losing revenue, and oh, they’re building a new website too.
Thirdly, the Winnipeg Sun. I had a great meeting with the then boss Kevin Klein, a nice guy who actually had some idea of what the future might hold, but unfortunately they were tied to some boat company and their advertising ways, which weren’t good either. So today they announce they are laying off 500 workers to save $45 million a year. Full Story Here
The moral of this story: don’t judge a book, or the person, even if the book might be slightly X-rated. If that book has been a best seller around the world, maybe, just maybe, there could be something in there you’re missing that could help.
The world has changed dramatically in how we get news: tweets reach our tablets hours before the main news announces it, and mobile uploads come straight from the scene. If you have not lived this way of life for the last decade, it’s very hard to catch up.
Shit kickers we are, we have never denied that, but it’s all for our clients; when they win we win, and we win a lot.
It has been about a year since Google began automatically encrypting search data for its signed-in users, a group which has crept up in the search metrics to become a rather hefty portion. As it has turned out, almost 40% of the keyword results across a group of 400+ websites have come back as ‘not provided’ in Google Analytics.
The 40% statistic for ‘not provided’ is just an average of what was found in the sample; some sites reported values as high as 60%. As a user of the web, on the surface it looks like a great feature that being signed into your Google account doesn’t disclose how you searched for a site, but for someone who works daily on the web it has been a real hindrance. A very basic breakdown of how, as marketers, we would use search metrics to understand our site traffic is:
keywords you searched
page you landed on
does content match
did you stay on the page
That’s just a very basic rundown of how we use the information that is now hidden from Analytics users. Now that Analytics users are seeing the ‘not provided’ keyword for as much as 40% of their search traffic, it’s tougher for them to discern visitor flow through to their websites.
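As a rough sketch of the arithmetic involved, assuming you’ve exported keyword-to-visit counts from Analytics, you can compute how much of your traffic is hidden; the figures and keyword names below are invented for illustration:

```python
def not_provided_share(keyword_visits):
    """Percentage of visits whose keyword Analytics reports as '(not provided)'."""
    total = sum(keyword_visits.values())
    hidden = keyword_visits.get("(not provided)", 0)
    return 100.0 * hidden / total if total else 0.0

sample = {"(not provided)": 400, "winnipeg seo": 350, "online branding": 250}
print(not_provided_share(sample))  # 40.0
```

The higher that percentage climbs, the less of the keyword-to-landing-page-to-conversion chain above you can actually trace for your visitors.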
It sounds on the surface like a win for user privacy and control of your information, but in reality it makes targeted marketing campaigns much more difficult. Think of it as shotgun marketing versus precisely tuned marketing: instead of knowing exactly which searches bring visitors in, you have to play the back-and-forth game, trying an approach, checking whether it worked, and trying again. It can make the process much more drawn out than it should be.
We’re firm believers in choosing the right tool for the job, and that’s why here at Fresh I am not a web designer. I can optimize a site, I can make it run like it’s on lightning, and I can clean and organize code to a point of near OCD. Ask me to design a site, however, and I fail horribly on the aesthetic side. That’s the point where we bring in a proper designer to create the product image, and I take it from there.
It’s very important that we do this, too: just as it’s a time-consuming process to work the optimization to a proper level, so is the design of a site, all the way down to the color scheme. Color isn’t a deciding factor in organic search optimization, but you need to consider the visitor you’re building your site for. People, not crawlers, are what will bring you the return on your website investment, and if a visitor is immediately turned off by the aesthetic of your site and clicks that back button, you can chalk it up as a negative mark in the search engine results pages. If visitors don’t like spending time on your site, it won’t matter if you have the best content in the world; they won’t link back to it, you won’t build relevance, and that won’t help drive you up the results. It’s a circular issue, and it can all come down to a simple website trait, even just your color scheme.
They’ve been offline for a great while, but anyone who used the web 10 years ago remembers the visual atrocity that was the GeoCities personal page. GIF-laden with pixel graphics and flashing banners, it was a difficult task to navigate through them to find whatever content you happened to be looking for. I’m sure there are enough people out there who remember trying to read yellow text on a green background a time or two. The web has thankfully matured to a much better point, with high-resolution graphics and some vastly talented designers coming up with incredibly detailed and interactive websites.
So when you’re working on the image of your website, remember to keep things easy on the eyes. You can be stylish and still be functional: make your site easy to navigate and read, make sure your images and graphics blend naturally with your site, and stick to color schemes that don’t assault the senses. It seems like such a trivial point when you think of the bigger picture, but even just the colors of your website can move it up or down the results pages.