How did the internet get started? Here is one story you might find interesting.
In ancient Israel, it came to pass that a trader by the name of Abraham Com did take unto himself a young wife by the name of Dot.
And Dot Com was a comely woman, broad of shoulder and long of leg. Indeed, she was often called Amazon Dot Com.
And she said unto Abraham, her husband, “Why dost thou travel so far from town to town with thy goods when thou canst trade without ever leaving thy tent?
And Abraham did look at her as though she were several saddle bags short of a camel load, but simply said, “How, dear?”
And Dot replied, “I will place drums in all the towns and drums in between to send messages saying what you have for sale, and they will reply telling you who hath the best price. And the sale can be made on the drums and delivery made by Uriah’s Pony Stable (UPS).”
Abraham thought long and decided he would let Dot have her way with the drums. And the drums rang out and were an immediate success. Abraham sold all the goods he had at the top price, without ever having to move from his tent.
To prevent neighboring countries from overhearing what the drums were saying, Dot devised a system that only she and the drummers knew. It was known as Must Send Drum Over Sound (MSDOS), and she also developed a language to transmit ideas and pictures – Hebrew To The People (HTTP).
And the young men did take to Dot Com’s trading as doth the greedy horsefly take to camel dung. They were called Nomadic Ecclesiastical Rich Dominican Sybarites, or NERDS.
And lo, the land was so feverish with joy at the new riches and the deafening sound of drums that no one noticed that the real riches were going to that enterprising drum dealer, Brother William of Gates, who bought off every drum maker in the land. And indeed did insist on drums to be made that would work only with Brother Gates’ drumheads and drumsticks.
And Dot did say, “Oh, Abraham, what we have started is being taken over by others.” And Abraham looked out over the Bay of Ezekiel, or eBay as it came to be known. He said, “We need a name that reflects what we are.”
And Dot replied, “Young Ambitious Hebrew Owner Operators.” “YAHOO,” said Abraham. And because it was Dot’s idea, they named it YAHOO Dot Com.
Abraham’s cousin, Joshua, being the young Gregarious Energetic Educated Kid (GEEK) that he was, soon started using Dot’s drums to locate things around the countryside. It soon became known as God’s Own Official Guide to Locating Everything (GOOGLE).
That is how it all began. And that’s the truth.
Anyone can build a website and make it active online. But once the search engines start crunching through your pages, the bots take your site apart. There are a number of long-standing factors which are widely known to influence your credibility online, and recently some newer metrics have begun to come into play.
Two of the most talked about and widely accepted signals of online credibility are the links on your website, both those coming in and those going out. Inlinks, or backlinks if you prefer, are the way that websites lend credibility to one another. The more credible the links coming into your site, the better your site will be received in the SERPs.
The opposite of an inlink, naturally, is an outlink. The more you link your site to other relevant sites online, the more you can increase your authority in your niche. If you’re not careful when you outlink to another site, however, you can inadvertently damage your website, or the site you’re linking to.
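As a rough illustration of how a tool might separate internal navigation from the outlinks described above, here is a minimal sketch using Python’s standard-library HTML parser. The class name, the sample markup, and the `example.com`/`example.org` domains are all hypothetical, chosen just for the demonstration.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutlinkCollector(HTMLParser):
    """Collects hrefs that point to a different domain than the page's own."""

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.outlinks = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        domain = urlparse(href).netloc
        # Relative links (empty netloc) are internal; only absolute links
        # to another domain count as outlinks.
        if domain and domain != self.own_domain:
            self.outlinks.append(href)

# Hypothetical page fragment with one internal and one external link.
page = '<a href="/about">About</a> <a href="https://example.org/guide">Guide</a>'
collector = OutlinkCollector("example.com")
collector.feed(page)
print(collector.outlinks)  # ['https://example.org/guide']
```

A real audit would also normalize scheme-relative URLs and subdomains, but the split shown here is the core of the inlink/outlink distinction.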
Some of the easier-to-control facets of improving your credibility online involve straightforward, albeit often forgotten, steps in your website’s construction. Making your website easy to navigate for users makes it easier for people to bookmark and return to your site on multiple visits. The second step which makes things easier for you online is to include a way for the search engines to easily navigate your site as well. The best way to assist with that is to include an XML sitemap. Taking the added step to keep it up to date as you add and edit pages is essential for building and maintaining your reputation on the web.
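The XML sitemap mentioned above follows the standard sitemaps.org format: a `urlset` root with one `url`/`loc` entry per page. A minimal generator can be sketched with Python’s standard library; the URLs below are placeholders, and a production sitemap would typically also carry optional tags like `lastmod`.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Builds a minimal XML sitemap (urlset > url > loc) from page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical page list; regenerate this file whenever pages change.
sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/contact",
])
```

Keeping this script in your publish pipeline, so the sitemap is rebuilt on every content change, is one way to satisfy the keep-it-up-to-date step.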
Your website’s reputation online is just as important as maintaining your SEO measures if you want to stay on top of your niche. Make sure to take the time to investigate who may be linking to your site and your content, and that any links you have going to other sites are in good working order.
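Checking that your outgoing links are “in good working order” is easy to automate. Below is a hedged sketch: a checker that treats any 2xx/3xx response as healthy. The `fetch` parameter is an assumption of this sketch (it lets the network call be swapped out for testing); the URLs are placeholders.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_links(urls, fetch=None):
    """Maps each URL to True if it answers with a 2xx/3xx status, else False.

    `fetch` may be replaced with any callable url -> status code,
    which keeps the function testable without network access.
    """
    def default_fetch(url):
        # HEAD keeps the check lightweight; some servers reject HEAD
        # with a 405, in which case a GET fallback would be needed.
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=10) as resp:
            return resp.status

    fetch = fetch or default_fetch
    results = {}
    for url in urls:
        try:
            results[url] = 200 <= fetch(url) < 400
        except (HTTPError, URLError, OSError):
            results[url] = False
    return results
```

Run on a schedule, a report like this surfaces dead outlinks before they start costing you credibility.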
Republican Rick Santorum has had a few... issues over the course of his public run for the highest office in the US. All politics and personal opinions aside, Santorum has run into the most recent iteration of what used to be known as Google bombing.
A few years back, if you visited Google and searched for the term ‘miserable failure’, the first result used to be the then president of the US, George Bush. Santorum unfortunately hasn’t been so lucky with what he’s become associated with, but the reason he’s momentarily stuck there comes down, in part, to how SEO works.
The bare basic principles of SEO: you create content, you contact people to link to it, and you repeat over and over again until you’re the king of the mountain. Backlinks, the fuel of SEO, are what keep you flying high in the rankings. An extremely popular social community, Reddit, has decided that the politician will remain associated with this alternate meaning for the near future at the very least.
You can think what you will of the woes he is experiencing; on a technical note, however, it bears examination and repeating. Until Google, Bing and Yahoo outright remove a website from their indexes, it can and will continue to rank for the term(s) it has been written to be found for. Completely removing a site from the index is an extreme measure that the search engines rarely wield, let alone with permanency. It should be noted that regardless of the poor quality of the links, the relevance and sheer number of them pointing to the offending website are what is keeping the politician tied to his less-than-stellar ‘definition’. To deal with the issue quickly, the search engines could enforce a nofollow on the Reddit site, but Google in particular deems itself a color-blind indexing machine. Mr. Santorum would do well to contract the skills of a search engine optimization expert to assist with managing his online reputation.
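The nofollow mechanism mentioned above works at the level of individual anchor tags: a link carrying `rel="nofollow"` tells crawlers not to pass ranking credit through it. As a sketch of how an auditor might count which links on a page actually pass credit, here is a small standard-library parser; the class name and sample markup are invented for the example.

```python
from html.parser import HTMLParser

class NofollowAuditor(HTMLParser):
    """Splits a page's links into followed vs. rel="nofollow" links."""

    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel is a space-separated token list, e.g. rel="nofollow noopener".
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

auditor = NofollowAuditor()
auditor.feed('<a href="/a">A</a> <a rel="nofollow" href="/b">B</a>')
print(auditor.followed, auditor.nofollowed)  # ['/a'] ['/b']
```

Had Reddit applied nofollow site-wide, every outbound link would land in the second bucket, and the links would stop feeding the ranking described above.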
A question often asked in the course of a discussion with a client, whether new or existing, is how we determine our costs as a search engine optimization company. There’s no single, or simple, answer: each client’s needs are unique, as anyone fully involved in the industry will tell you.
One of the factors which determines your cost for your SEO is what types of keyterms you want to optimize your website for. If you own a company which manufactures toys and you’re the new kid on the block, driving your website up in the SERPs is going to be difficult. A generic search for the term ‘toys’ returns 1.9b (yes, billion) pages. Granted, that’s a rather generic term to try and optimize for, but each term comes with its own challenges where keywords are concerned. The number of terms you wish to be optimized for also adds to the maintenance cost of your website.
Another determining factor that affects your overall cost is the overall quality of your website and its content. If we need to sit down with you and assist in rewriting each and every page due to a lack of quality content, that’s a necessary step which must be in place before we even begin to think about scouring the web for backlinks to your site. Likewise, if your website is full of choppy code which needs to be addressed, or is so woefully out of date that a complete rebuild is in order, these also contribute to the overall cost.
These are only a couple of the factors which contribute to your company’s online advertising budget. Thankfully, if you need to rebuild your website or rewrite your content, those are often one-time costs. The consistent maintenance required to continue ranking well in the SERPs, however, is where the majority of your budget needs to be directed. And as the web becomes more and more competitive, that budget will need to be adjusted every 12 months or so.
Did Bing play dirty over the shopping holidays? If you tried at all this most recent Cyber Monday to use the Bing search engine, the signs currently point to yes, they did play dirty with their results.
The creators of the idea of Cyber Monday found themselves lost in Bing’s search listings because, according to Bing, their content was too “thin”. If the term is familiar, it’s because it sounds a lot like Google-speak from when they started rolling out the infamous Panda updates and culling “thin content” websites from their index. A difference to note, however: Panda didn’t actually remove the offenders from the index, it just meant the odds of those sites ranking well plummeted.
Back now to Bing’s version of taking care of thin content: removing websites which fall into this category. Cyber Monday is now a billion-dollar online shopping event, where website owners have the opportunity to make good money heading into the holiday shopping weeks. A site which could promise and deliver strong referrals, and rank well, would also stand to make a fair bit of change. Shop.org coined the term Cyber Monday in ’05 and a year later created the corresponding website, cybermonday.com. This past Cyber Monday, Google had the website in their SERPs, while Bing did not. Bing did, however, have their own shopping channel listed at the top of their results for the search ‘cyber monday’.
Bing has stated previously that they will dispense internet justice on sites deemed unworthy to be listed as part of their SERPs, but completely removing any and all traces of a site? Bing defines spam as:
Some pages captured in our index turn out to be pages of little or no value to users and may also have characteristics that artificially manipulate the way search and advertising systems work in order to distort their relevance relative to pages that offer more relevant information. Some of these pages include only advertisements and/or links to other websites that contain mostly ads, and no or only superficial content relevant to the subject of the search. To improve the search experience for consumers and deliver more relevant content, we might remove such pages from the index altogether, or adjust our algorithms to prioritize more useful and relevant pages in result sets.
So if Bing were to stick to its guidelines, then by removing the cybermonday.com website it should also remove all “thin” websites falling under the same blanket. Yet it did not, and websites featuring almost identical content to the cybermonday.com website still appeared in its results. To further muddy the waters, the Bing-powered search results served up in Yahoo would turn up Black Friday “websites” which would be deemed even thinner than the Cyber Monday website. With all the fuss that Bing was putting up about Google favoring its own results over all others, this sure doesn’t reflect well on Bing. The Panda updates may drop a website’s rank if it’s found to be too thin, but at least they’re not completely removing sites from the index à la Bing.
I had an odd occurrence recently in terms of how search is evolving, and it involved a rogue browser extension. It’s mildly annoying when a toolbar gets installed in your browser of choice, but it’s frustrating when it’s installed without your express knowledge, say by having the install clause buried in a EULA for another program.
The rogue extension in question was Surf Canyon, a real-time search reorganizer, to give it a short description. With the internet comprising literally trillions of web pages, search engines like Google, Bing and Yahoo are the big hitters in locating what you need online. They all offer their own pros and cons, but Google has been the weapon of choice for the majority of searchers for the past 10+ years.
Real-time search results have become a challenge for all of the search players, with everyone working toward a solid solution for serving up relevant results which complement the current organic offerings. The idea behind the Surf Canyon extension is that it personalizes the web for you as you search. A fine enough idea; what was actually observed, however, was that the extension has somewhat of a mind of its own.
Toolbars are a nuisance in a browser, and fake links on webpages are a pain because you don’t really know what’s real and what isn’t without clicking. But a browser extension which plants false links into webpages that you know have no outgoing links? That’s poor business practice and sketchy access to a computer and its browsing history.
Is Bing more biased than Google when it comes to the results pages? In research that has been gaining traction as of late, the answer seems to be yes. It wasn’t a directed study on a few select terms either, it was a large random sampling of the SERPs conducted by a professor at George Mason University.
What he found in the tests that he conducted was that for the most part, Bing will favor Microsoft content more often and more prominently, than Google favors its own content. According to the findings, Google references its own content in its first results position in just 6.7% of queries, while Bing provides search result links to Microsoft content more than twice as often (14.3%). The percentages may seem small, but when you consider there are billions of searches performed daily, suddenly 14% isn’t such a small number.
The findings also cast a different light on the recent FTC antitrust complaints which Google has been handling surrounding anti-competitive behaviour. It’s also a stark contrast to a similar study done earlier in the year, which concluded that “Google intentionally places its results first.” So now as a user with two completely different data points, which is the set to believe?
Well, the second study was conducted with two goals in mind: to replicate the findings of the first study, and to expand on the methods used to determine whether the results were perhaps an artifact of how they were gathered. From the very beginning, it was found that while Google does favor its own content at some points, the selection of terms for which it does so is exceedingly small. What was also learned, and wasn’t mentioned in the first study, is that Bing does precisely the same in preferring Microsoft results, but for a much wider range of terms than Google and much more frequently: “For example, in our replication of Edelman & Lockwood, Google refers to its own content in its first page of results when its rivals do not for only 7.9% of the queries, whereas Bing does so nearly twice as often (13.2%).”
As for the second part of the study, it used a much larger, more random sampling of search queries, as opposed to the mere 32 samples the first study used to portray Google as the big bad guy of search. The findings of the second study were relayed at the beginning of this post: Google references its own content in its first results position when no other engine does in just 6.7% of queries, while Bing does so over twice as often (14.3%).
So, what does this mean for the end user? Google (and Bing, though less so) really are trying to deliver the best results possible, regardless of whether they come from their own services (local search, product search, etc.) or not. It all comes down to preference.
There are many steps which are part of a successful organic SEO campaign. There are all the little steps, like writing good content, making sure you have the titles and meta tags in place, and having a comprehensive menu. When you’re finished with the good-practices pages, you begin to read about one of the more time-intensive steps of the campaign: link building.
Since Panda reared its head over the last year or so, there’s been chatter about how the SEO game has fundamentally changed: that the scrapers and content aggregators, the black hatters and the link buyers, would just disappear and we’d have pristine, precise results. Time has started to play its part, and while the scrapers, aggregators, black hatters and link buyers have mostly been swept away, there has recently been a new call to revamp the way the system works. The desire to change the link-building portion of the search game sometimes comes up in discussion as the points for and against the practice are argued. When you break it all down to the basic points, every search engine will tell you the same thing: content is king. If you produce quality, relevant content, you will rank in the SERPs.
The kicker about producing this kind of content is that you will naturally receive backlinks to your site and its pages. When you’re a new site and you need to visit and email potential consumers and partners in the same niches, building those backlinks takes time. But they will be built, they will be counted as a metric by the search engines, and until an algorithm comes along which can read and evaluate content as a user would, link building will stay relevant. It will be an important portion of any and every organic SEO campaign, no matter how big or how small. The success of your link-building campaign can be directly tied to how much work you’re willing to put into contacting those in industries which complement your own.
It’s reassuring that even though some businesses out there are slow to improve their websites or their online marketing toolset, the trend is slowly but surely shifting. While still only a fraction of the marketing dollars spent out there, the numbers show that around 17% of most businesses’ marketing budgets are being spent on online marketing. Any positive growth is good for everyone involved.
A great graphic has been put together which outlines some of the changes coming about in the marketing world. In the US, 70% of businesses have indicated that they will be increasing spending on social media advertising (Facebook, Twitter, Google+), and 64% also chimed in to say their budget is increasing for SEO as well. With consumers spending more and more time searching online for their next purchase, it’s much more advantageous to get into the game now, as opposed to later. The longer you wait, the greater your costs are going to be. Surprisingly, however, it came back that 17% of businesses planned on increasing their marketing budgets for print media, which is much like buying stock in Yahoo these days. I kid, I kid. All jokes aside, almost anyone out there who has a job has access to the internet. It should be no surprise that on average people spend 3+ hours browsing the internet, and that 84% of people who use the internet spend their time searching for information on whatever has caught their interest. There are billions of searches per day.
There’s a great deal more information which can be gleaned from the stats, so have a look and take a moment to consider your marketing plans. Are you on the side of innovation and forward thinking? Or are you trying to cling to an outdated, unmeasurable standby? Just remember that the longer you wait, the more difficult the game becomes.
So remember a little while back when Google decided to try the whole social thing and launched Buzz? And it cost them a few million because they “oops forgot privacy”? Well the FTC has finally decided how to handle the giant and it will throw a bone at the privacy concerned members of the public to boot.
For the next 20 years, Google will be subject to privacy monitoring from the U.S. Federal Trade Commission. Thanks to what the FTC calls “deceptive practices” around its now-dead Google Buzz social networking service, the search giant will have the agency babysitting it. The investigation into Google’s privacy practices began after users complained that their Gmail contacts were made public, and that the steps to protect their privacy weren’t clear or effective.
When the service launched, Gmail users were given the option to participate in Buzz, but what Google failed to mention was that the people they email most often would be listed publicly. Those users that declined to participate were also automatically enrolled into at least some of the Buzz features without their consent, according to the FTC investigation.
“In response to the Buzz launch, Google received thousands of complaints from consumers who were concerned about public disclosure of their email contacts which included, in some cases, ex-spouses, patients, students, employers, or competitors,” the agency said. “The FTC charged that Google failed to disclose adequately that consumers’ frequent email contacts would become public by default.”
The FTC also added that Google had misled the public in regards to its privacy policies, and misrepresented its compliance with U.S. and E.U. Safe Harbor “or other privacy, security, or compliance programs.” So for the next 20 years Google is going to have a monkey on its back, with the FTC able to watch its every social change, and if the search giant isn’t careful, it may find itself back in hot water.