While the possibility that Android, a beloved smartphone platform, could be sued out of existence by Apple and Microsoft is alarming to many, this incident serves most of all to illustrate much broader problems with the U.S. intellectual property system.
Companies in the U.S. are laying claim to increasingly generic intellectual property and using that IP as an instrument not to innovate, but to litigate. In most cases the street runs both ways: IP lawsuits are often followed by IP countersuits. But frequently one player in the market is using IP as a bully, while the other is merely trying to defend itself.
Many argue the U.S. desperately needs intellectual property reform. But the federal government, under both former President George W. Bush (R) and President Barack Obama (D), has been slow to act.
The Nortel sale should offer a key signal to the market. If the federal government blocks it, it may be a sign that the era of using IP as an offensive weapon is coming to an end. On the other hand, if it’s approved without restriction, it will offer a virtual blueprint for how to defeat your competitor. If the latter scenario plays out, consumers may find themselves in an odd market where it’s not the competitor with the best products that wins, but the company with the best lawyers and patent portfolio.
Personally, knowing Google as I do, I think they may well have something up their sleeve. Watch this space.
Google Realtime search is officially dead in the water with the expiration of the agreement with Twitter. But is it truly finished, with beta testing of the Google+ social site going on in the background?
It’s no surprise that when Facebook rolled out its version of search some months ago it bombed: the results were a smattering of Facebook pages only loosely related to your search terms. Google, for its part, badly fumbled its first inroad into social with Buzz, mishandling pretty much everything where privacy was concerned. Facebook hasn’t really rehashed its search algorithm or modified it to be any kind of competitor in the search arena, but Google is now making a play on Facebook’s turf.
By all accounts Google+ is a fairly decent product so far in its beta testing. The ability to sort your friends into your own personal groups, plus privacy controls you can turn up to 11, puts the newest offering on solid ground out of the gate. Google already has an immense suite of products on the table, with Documents, Calendar and so on, which could make the social site a place of productivity as well. And with Realtime search now defunct, having its own social site gives the search giant a way to use its own posters to fuel that engine. Google+ also has a group video chat feature called Hangouts, which with some tweaks (rumor says it devours bandwidth) could be a great way to collaborate with friends and even colleagues. Facebook, in what could be construed as a response to the Hangouts feature, released an integration of Skype into the social site’s chat features.
At this moment it looks like Google+ is going to be a solid competitor in the social arena; it just remains to be seen what else they can plug into it. Being able to, say, completely migrate all of your Facebook friends into Google+ would be a good start.
When you log on to your computer, fire up your browser and start your internet trek for knowledge, entertainment or whatever it is that has your mind occupied, are you going to be able to find your answer? It’s a question which has been gaining more and more traction in the last year or so, and DuckDuckGo, a new startup search engine, has been shaking the search cage in an effort to forge its own path.
Recently they put up a page detailing how, when you perform a search on Google, Bing or Yahoo, you’re not getting a neutral results page. The screenshot of the search results clearly shows that different people receive different results for ‘Egypt’ as a search term. Without even reading the link text, it’s clear that the results pages are vastly different. Why they differ comes down to dozens, if not hundreds, of reasons: it can be as simple as your location in the country, the time of day or what’s been trending in the news lately. The short pictorial on the DuckDuckGo page details how search engines, Facebook, Twitter and the like all deliver pre-packaged results based on your web usage, and contends that this shouldn’t be happening.
DuckDuckGo is a search engine which doesn’t save your search history, doesn’t pass your search terms on to the websites it refers you to, and has a nifty red box called zero-click info (powered by Wolfram Alpha) which appears on some searches; after all that, it’s throwing its hat into the search engine ring. Being a new player at an old game is a tough position, and DuckDuckGo is performing search in a way that attempts to deliver both a filtered *and* an unfiltered internet. It’s a noble idea with some merit: if you’d like to perform somewhat private searches on sensitive matters, it may be an alternative for you. Google Chrome and Internet Explorer, however, both offer a cookieless browsing mode which accomplishes much the same result, so you don’t really have to give up the engine you know and are familiar with.
The only real way to test whether you genuinely live in a “search bubble” is to perform the same search, with zero clicks, on multiple computers. If your results are significantly different from other people’s, then perhaps you have a case. Personally, after viewing the screenshots and looking closely at how many pages were fetched for each search term, there are tens of millions of pages of difference between them, so of course the results are going to differ. Part of Google, Bing and Yahoo’s success comes from the fact that they pass some search data, in the form of the search term, to the website you click through to; it’s what enabled the search engines to build their ad programs for web users. There are dozens of variables behind the results you receive after you click that search button, and even something as simple as which data center serves your results influences your page: if it happens to be running an index a few hours older than the others, you can easily get different results when performing the same search multiple times.
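If you do run that multiple-computer experiment, it helps to put a number on how different two results pages really are rather than eyeballing the screenshots. A minimal sketch (the URLs here are made up for illustration; this assumes you’ve already collected the result lists from each clean, cookieless session):

```python
def overlap_score(results_a, results_b):
    """Jaccard similarity between two sets of result URLs:
    1.0 means identical result sets, 0.0 means completely disjoint."""
    a, b = set(results_a), set(results_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical top-5 results for the same query from two different sessions
session_1 = ["egypt.gov.eg", "en.wikipedia.org/wiki/Egypt",
             "news.example/egypt-crisis", "travel.example/egypt",
             "cia.gov/egypt"]
session_2 = ["en.wikipedia.org/wiki/Egypt", "egypt.gov.eg",
             "lonelyplanet.example/egypt", "news.example/egypt-crisis",
             "history.example/egypt"]

print(round(overlap_score(session_1, session_2), 2))  # → 0.43
```

A score well below 1.0 across many machines would be consistent with personalization; remember, though, that stale indexes and different data centers can produce the same effect.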
Gene Simmons was in Winnipeg last night on his speaking tour, and we even gave him the key to the city! Before he came to town, however, he gave an interview which touched on a very important business point. The KISS frontman is entirely self-made and merchandised the group to superstardom, so it’s safe to assume he knows a thing or two about making a buck.
In that interview he made a point which all business owners, new and old, need to take heed of: “..everybody is really in the same business: The fame business. You have to make your name mean something and people have to recognize your name is synonymous with quality. Your name and your story should precede you.” His response is exactly what we’ve been telling clients, current and prospective, in the search game: you need to begin branding yesterday, because the world is changing today. When you have quality products or services, your name should be on the lips of those in your niche without hesitation, and you should be found online with a simple search of your brand.
Even Google is getting in on the brand bandwagon, so to speak. Recently, images and reports have cropped up of a change to the way the SERPs are displayed. Typically your title is in blue with a snippet of information, and your website URL is displayed at the bottom in green. Lately, however, Google has been playing with the idea of displaying only the brand name of the result; showing Facebook instead of http://www.facebook.com, for example.
The time to make your brand is now, and the time to make your brand known is now. Making your company’s name synonymous with quality, integrity and worth can carry it to great success. Or you can sit on a dated website with your yellow pages ads and radio spots, and wonder why the new guy in town is making it rich while your customers slowly drift away.
The founder of the newer engine DuckDuckGo recently discovered that he’s being hit with tons of spam queries for all sorts of seemingly random searches. He’s noted that while he can block these botnets from spamming his servers with the same query over and over again, the traffic has raised a question.
In his own words from his blog:
“if other search engines include this errant traffic in their query counts. We work hard to keep them completely out because they would overwhelm our real direct queries #s and therefore distort our perception of progress.”
And while Gabriel makes a solid point and raises a great question about the quality of the search and query numbers being generated, I think he’s missing the simplest answer. The founder of DuckDuckGo has managed to block the botnets at the firewall level to prevent them from skewing his query numbers. And given that the other search engines, Google, Bing and Yahoo, have been around far longer, it’s reasonable to assume they’ve already dealt with this kind of false-search issue. As far as SEO is concerned, this kind of activity amounts to query spam: bots making it look as though the websites in question have received hundreds of thousands of queries and clicks from real users. It’s a game and method which was dealt with years ago by both Google and Bing, so it’s almost completely a non-issue.
I think the more realistic explanation for the botnet traffic on the new search engine is a very simple problem that anyone with a website that has an input box and no validation can relate to. It’s just spam, probing for an exploit or a kink in the code of whatever website software it has detected. It’s argued that small search engines like Blekko and DuckDuckGo offer better-quality search because they are smaller and less bloated than their big brothers. In time, however, I suspect it will become clear that the larger these small engines grow, the more difficult it will be to deliver incredibly fast results (under half a second) while maintaining a complex index of hundreds of billions of pages. Google just last year passed the 3 trillion pages indexed mark, a number which would cripple most data centers in existence.
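Blocking that kind of junk traffic doesn’t have to happen at the firewall; it can also be done at the application layer. A minimal per-client sliding-window rate limiter might look like the sketch below (the class name and thresholds are illustrative assumptions, not anything DuckDuckGo actually runs):

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Allow at most `limit` queries per `window` seconds per client."""

    def __init__(self, limit=30, window=60.0):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)  # client id -> timestamps of recent queries

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[client_id]
        # Drop timestamps that have aged out of the window
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False  # over the limit: likely a bot hammering the input box
        q.append(now)
        return True
```

A bot firing the same query hundreds of times a minute trips the limit almost immediately, while a human searcher never notices it’s there.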
With the addition of the Google +1 button to the social world, an old question is starting to make advertising agencies take notice again: is the social element of search beginning to shape where you show up in the SERPs?
It has long been known that when you “Like” a topic via the Facebook button, you can generate a fair amount of traffic with a simple click. It’s only recently, though, that those “Like”s have begun to shape your personal results pages in Bing. When you’re signed into your Facebook account and perform a search in Bing for, say, model racecars, you’ll see mixed into your results whether any of your friends are involved in the same model racing scene as you are. It can create a good deal of traffic if your site caters to a social audience. With the addition of the Google +1 button, it’s assumed we’ll begin to see the same types of results integrated into Google as we now see in Bing with the Facebook “Like” button.
Part of the idea is that you can discover which people on your friends list you have more in common with than you already knew. It’s really a personal preference at this point in the game, as you need to be signed into both services to see your friends’ likes in your search results. It will be an interesting shift in the search game, depending on how heavily your friends’ connections are weighted against the organic listings as they stand today.
Eric Schmidt, Google’s long-running former CEO, had a lot of thoughts to share about the online world during a conference yesterday.
Facebook, for example, has connections to all of the friends you have, have ever had, and even the ones you’ve forgotten about; they’re almost all there, ready for you to find and become reacquainted with. Microsoft has its finger in the business pie, so to speak, as that is its strongest market. Amazon, Schmidt shared, is seen as the largest “store front” on the internet, and Apple makes pretty things.
For all the merits he bestowed on his comrades in the online world, he was also quick to add that, strong as they are, one of the companies discussed is outside the expanding loop of the internet. The giant that seems to be missing the bus is Microsoft: it just doesn’t employ the same “platforming strategy,” as Schmidt called it, as the rest of the bunch.
The discussion, however, was not limited to Microsoft’s perceived weakness in the current digital age. In a rather candid moment, Schmidt declared, “I screwed up.” Of the laundry list of complaints people the world over have had about the search giant’s practices and procedures, the mistake Schmidt was speaking of was missing the boat to the social party. That’s not to say Google is a one-trick pony, of course, just that he missed the social boom.
Schmidt has since passed the CEO reins to Larry Page, who has shifted the company’s focus toward social with a determined vision of becoming a serious player.
If there is anything in the world of search which can change your site’s ranking performance overnight, it’s being flagged as a malicious site or as hosting malware links on your pages. When searchers are warned in the SERPs that visiting your site may harm their computer, it’s not hard to imagine they’ll choose to stay away.
Being flagged as a malicious site, or as having malware links on your pages, can happen a number of different ways. At the serious end, your site and/or server may have been hijacked and your pages rewritten. You could have been hit with an iframe attack, a clever hacker could have injected code through a page comment or link, or you could simply have been reported by a jealous competitor. There are a great many ways to end up flagged as a malware/malicious site.
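The comment-injection vector in particular is largely avoidable with basic output escaping. As a minimal sketch (using only Python’s standard library, not any particular CMS’s API), escaping user-submitted comments before rendering turns a script or iframe payload into harmless text:

```python
import html

def sanitize_comment(raw: str) -> str:
    """Neutralize script/iframe injection in user-submitted comments by
    escaping all HTML before it is ever rendered into a page."""
    # html.escape turns < > & " ' into entities, so <script> or <iframe>
    # payloads display as plain text instead of executing in the browser.
    return html.escape(raw, quote=True)

payload = '<iframe src="http://evil.example/mal.js"></iframe>'
print(sanitize_comment(payload))
# → &lt;iframe src=&quot;http://evil.example/mal.js&quot;&gt;&lt;/iframe&gt;
```

Escaping on output like this won’t undo a hijacked server, but it closes the door on the “clever hacker writes a code injection in a page comment” scenario described above.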
There is a story circulating in the search world today about one such website owner’s dilemma. Their website, a Yahoo-based store, was flagged as a malicious website containing malware links. This is a web-based business whose CMS is sandboxed, after a fashion, by Yahoo’s controls, and Bing has labelled it malicious. Being flagged is a terrible thing, but since due diligence turned up no fault, the owner submitted a ticket to have the flag removed as it had been applied in error. Here’s the big problem: after submitting that help ticket and noticing that no change had been made, they opened a help ticket with Bing. The response they received has to be nothing less than shattering: malware re-evaluation with Bing can take anywhere from three to six weeks to resolve and for the flag to be removed. When your business is primarily generated through your online presence, losing three to six weeks of business due to an error on a search engine’s part is devastating to your livelihood.
It’s always worthwhile to search for yourself online, to ensure you’re placing where you’re aiming and displaying the information you want to be known for. What you definitely do not want to see is the malicious-website warning tied to your site, as it takes Bing a minimum of a month, and perhaps as long as six weeks, to remove it. On the upside, Google only needs about 24 hours to remove a misplaced flag.
There are some general misconceptions about SEO which crop up from time to time, often when going over the process with clients. Some points are extremely valid questions to bring up, while others get ambiguous answers because the landscape changes every day.
A question like “Why do we need to wait while building backlinks to our site?” tends to come up, for example. Building quality backlinks to your website takes time, first of all; secondly, if you go the shady route and buy thousands of links to boost your PageRank, it’s a very quick way to get the search engines’ attention. And not in a good way!
“Why should I pay you every month when this other guy says he can do the same for a one-shot job?” This is probably the largest misconception about the SEO industry, and one of the hurdles we meet in dealing with new clients. The biggest reason you can’t do a single once-over and expect the results to carry on forever is that the internet doesn’t shut off. It doesn’t stop, it doesn’t sleep, it’s always changing. And in order to keep up, the search engines do exactly the same: they change their algorithms, tweak the results and shift the rankings on a weekly, sometimes daily, basis. Upkeep is absolutely essential to remain competitive in search engine optimization, and anyone telling you they can plant you firmly at the top for a one-time cost of $200 is yanking your chain.
“SEO doesn’t seem so bad; I’m sure our techs can do it here.” This is perhaps the most closed-minded statement we encounter. I’ve written about it here on the blog before: pick the right horse for the course. When you’re building a new website, contract a web designer. When you’re adding or updating basic information on your site, use your techs. When you want to bring your brand and website up in the rankings, use a search engine optimization expert. Assuming the tech who does your database scripting will handle your optimization is money lost at best. At worst, they try to shortcut your site and you get kicked from the index for breaking a rule or two.
There’s a new Google search results page being tested out in the wilds of the internet. Reports vary at present, but there is a central theme to the different layouts being discussed. The most consistent trend reported in the new pages is… more white space in the results.
It doesn’t sound like a game-changing shift for search, but in reality it very much is. Millions of people use the internet every day to do a myriad of things: searching, playing games, writing stories and blogs, or researching who knows what. Almost all users have a 17-inch monitor or larger and a resolution to match. As strange as it may seem, monitor size and resolution play into the new search results page and how it may affect your search ranking.
Simply defined, white space is the amount of blank space between elements on a web page. By adding more white space to the search results pages, Google has effectively lengthened the page, meaning that to get to number 10 of the top 10 results, you have to scroll down. Just as in the real world, location is everything when it comes to search results. If you’re not in the top 10, you’re not getting the views you need to be competitive, as a very high percentage of search users never click to page 2, let alone page 5, of their results. Usage studies bear this out as well: most users don’t even scroll down to the bottom of the top 10!
At present most users see the top 5 or 6 results on their screen. If Google goes live with this extra white space, you might only see the top 4, or even the top 3, depending on your monitor and resolution. If you were happy and content sitting at number 5 or 6 in your niche, it may be time to take a long hard look at your current site and see whether you can kick-start some forward gains. When it comes to being found, the top 3 is becoming more and more important.
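The effect described above is just arithmetic: taller results mean fewer of them fit above the fold. A toy sketch with purely illustrative pixel heights (not Google’s actual layout measurements):

```python
def results_above_fold(viewport_px, header_px, result_px):
    """How many results fit on screen before the user has to scroll."""
    return max(0, (viewport_px - header_px) // result_px)

# Illustrative numbers: a 768px-tall viewport with 150px of header/search-box
# chrome. At 100px per result you see about 6; pad each result out to 150px
# with extra white space and only 4 remain visible.
print(results_above_fold(768, 150, 100))  # → 6
print(results_above_fold(768, 150, 150))  # → 4
```

Shrink the viewport (a smaller monitor or lower resolution) and the visible count drops further, which is exactly why position 5 or 6 suddenly looks a lot less comfortable.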