A search engine seems like a rather simple tool on the surface. Whether Bing, Yahoo or Google is your flavour, when you arrive on the page you’re greeted with an empty text box and a search button. It’s not difficult to work out that with a few keystrokes you can be off and searching for whatever you fancy. But as simple as it seems, it can be a much more powerful piece of equipment if you take the time to learn how to leverage its power in your favour.
Google has stepped up this year, offering a class on how to use its search engine to discover what you’re looking for. It’s a free, online course whose goal is really just to improve the search experience, both for you and for them. From one of the Google blogs: “The community-based course features six 50-minute classes along with interactive activities and the opportunity to hear from search experts and Googlers about how search works.”
One Google representative likened it to always driving a vehicle in first gear: it will definitely get you where you’re going, but there’s a faster, better way to do it. The Google search engine offers a whole host of ways to construct a query, whether you want to do basic calculations or look up a time zone or zip code. If you’re having trouble describing what you’re searching for, you can even drag an image directly into the search bar and conduct your search that way. Add to the couple of tips here all of the search operators you can use (quotation marks, + or -, etc.) and the possibilities begin to multiply. I guess the question is: how’s your Google-fu?
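To make the operators above a little more concrete, here are a few illustrative queries (the exact syntax supported varies by engine and changes over time, so treat these as examples rather than a reference):

```
"winnipeg jets"           exact-phrase match
jets -hockey              exclude a term from the results
site:example.com jets     restrict results to a single site
define:serendipity        dictionary lookup
time in tokyo             time-zone lookup
15% of 89.99              basic calculation in the search box
```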
Yesterday afternoon, GoDaddy suffered an issue with their DNS servers for around 4-5 hours. Some reports tied it to a hacking attempt, and the result was the complete shutdown of an estimated hundreds of thousands of websites. Only GoDaddy really knows how large the affected number is, but if you did lose traffic yesterday, you can rest easy about your rankings and position, at least with Google. John Mueller had this to say about the outage: “In this case (the service outage), assuming it was just the DNS that wouldn’t resolve, we would treat it similar to how we treat other temporary crawling issues, and just retry at some later point.”
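To picture how a crawler might treat a DNS outage as temporary rather than fatal, here is a minimal sketch of retrying a failed lookup with backoff. This is purely my own illustration of the idea, not Google’s actual crawl logic:

```python
import socket
import time

def resolve_with_retry(hostname, attempts=3, delay=1.0):
    """Try to resolve a hostname, backing off between attempts.

    Mirrors the idea in Mueller's quote: a DNS failure is treated
    as temporary, so rather than giving up (or penalizing the site),
    the lookup is simply retried at some later point.
    """
    for attempt in range(attempts):
        try:
            return socket.gethostbyname(hostname)
        except socket.gaierror:
            # Resolution failed; wait and retry unless out of attempts.
            if attempt < attempts - 1:
                time.sleep(delay * (2 ** attempt))  # exponential backoff
    return None  # still unresolved after all attempts
```

A site that resolves on a later attempt is indistinguishable, from this function’s point of view, from one that never failed at all, which is exactly the behaviour site owners were hoping for during the outage.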
There’s been an interesting tidbit of news in the last couple of days surrounding Google, although it’s not about any search decisions or changes they’ve made.
One of the notions being put forward is that Google is wary of Amazon and its recently announced Kindle Fire. The new tablet runs on the Android operating system, which Amazon has tailored to its own use. Normally this wouldn’t make a whole lot of news (there are a handful of tablets on the market of varying quality, and some use Android), but the reason the Fire has made a ruckus is how it’s been tailored.
Normally when a company tailors Android to its own use, the default search engine is Google. Amazon, likely because of its business model, chose to forgo Google as its search provider; instead, when you search on the Fire you’re directed to an Amazon search results page. As a user this makes sense: you’ve bought an Amazon product, so if you’re searching for or curious about a product, you receive an Amazon page. The reason this is a big deal for Google is that when people conduct a search to purchase or research a product, they’re qualified users, and the ads served on those results pages are worth more to Google. With Fire users being directed to an Amazon results page, they skip that money generator entirely, a complete loss for Google.
What hasn’t been truly answered is what happens when you do an informational search, say for local movie theaters or restaurant bookings. Amazon isn’t so much a local retailer as a global one, so it’ll be interesting to see what gets delivered.
Google is the most widely used search engine globally, and accounts for roughly 65% of usage in North America. Bing is the rebrand of Microsoft’s Live Search service, and it has been holding steady lately at around 30% of the search marketplace.
Bing has long contended that it offers a comparable search service, and some in the search world share that sentiment. But even with the rebrand, television commercials and the takeover of the Yahoo search market, its share remains steady at around a third of the market. Their latest effort, dubbed the Bing It On test, is a blind survey that displays unformatted, unbranded results and lets the user decide which of the two sets they would use. It’s the same testing method as the Pepsi challenge, where random people were given samples of two drinks and asked which they preferred.
When Bing tallied the results of their (very small) online sample of 1,000 people, they found that users chose the Bing results at almost a 2:1 ratio. That’s a rather large statistical difference from the current norm of Google dominating the search market, so why wouldn’t the numbers match the current market share? Well, for starters, the sample is small and self-selected: 1,000 online respondents in the 18+ demographic is a drop in the ocean when there are somewhere north of 200 million adults in the US alone. If you’re interested in which search engine appeals to you as a user, you can try out their survey for yourself here.
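For context on what a sample of 1,000 actually buys you: under textbook simple random sampling, the margin of error for a proportion shrinks with the square root of the sample size, and at n = 1,000 it works out to roughly ±3 percentage points at 95% confidence. The bigger statistical worry with an opt-in online survey is that the respondents choose themselves, so they may not represent searchers at large. A quick check of the arithmetic:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for an estimated proportion p
    from a simple random sample of size n (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

margin_of_error(1000)  # roughly 0.031, i.e. about +/- 3 percentage points
```

Quadrupling the sample only halves the margin, which is why pollsters rarely go far beyond a few thousand respondents; representativeness matters more than raw count.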
So after talking about the new patent papers filed about tricking spammers into showing themselves, with Penguin and Panda breathing down your website’s neck, what is there left for you to do as a webmaster? You follow the basics and stick to the best-practices guide; it sounds boring and cliché, but it works.
Focus on content creation: when you put forth the effort to create great pieces of content, the result is increased traffic to your website via shares and referrals. Another strong way to make your company and site well known is to distribute press releases for major changes. Just don’t abuse press releases, as talking up every little bit of activity can give your company the image of a spammy site. Take advantage of press releases and the traffic they generate when you have something impressive to share.
Make sure to take advantage of the growing social boom and add social sharing buttons to your website. Don’t assume that users will take the initiative to share your content on their own; making it simple for visitors to share your content with social share buttons is a quick way to garner additional views. Going hand in hand with social sharing and social media is creating your own YouTube channel. YouTube is the web’s second-largest search engine, so videos showing off your products or services are a great way to expose new audiences to your brand. And with the search engines starting to mix media results in with the organic results, you may even begin to find new visitors coming to your site via your YouTube channel.
And one of the most important tips that can be offered on improving your visibility: when you find yourself with a new client, always do your best to over-deliver on your company’s products or services. Word of mouth is still one of the strongest advertising methods, online or offline, new media or old. Once you’ve thoroughly impressed a new client, the odds of them bringing their friends and family to your site are very high. Be careful not to step on their toes, however, or to ruin a relationship when dealing with a client, as the only thing that travels faster than great news is bad news.
Previously I wrote about a Google patent which has gained more and more traction in the search community. The patent in question is named “Ranking Documents”, and as mentioned previously it seems like an interesting tactic to employ to catch those who use less-than-ideal tactics to rank a website.
Unfortunately, you’ll often find that when people discuss search engine optimization they tack on terms like spamming or buying links, but the truth of it is that when you do it right, SEO is none of those things. I think of the newly discovered patent like a magic trick: a bit of sleight of hand that Google is using to get unscrupulous tacticians to reveal themselves. Google shows them the index, and they spam to try to get ranked highly. When Google notices the spam, it shows a different set of results (if you don’t think they can do that, you’re not thinking clearly) just to see the reaction of their target. If they continue to offend, their site will likely be penalized, and the legitimately ranking pages won’t have been affected negatively.
Spamming links to a website to give unearned weight to a term is a measure used by those in the industry who are in the game for only a few quick clicks. Working from disposable URLs, they will target a hot, trending term (Paul Ryan’s RNC speech, for example) and try to garner as many links and as much visibility as possible in as short a time frame as they can. It’s a pump-and-dump tactic that is very likely to get the offending site banned from the index and the URL blacklisted.
I foresee that this patent will be dissected, discussed, scrutinized and blamed for all manner of SEO problems and headaches to come. The only problem some site owners have with Google, Bing and Yahoo is that the engines are getting better and better at catching people who try to cut corners or use less-than-natural methods to rank their sites. Maybe one day soon we’ll have an adaptive algorithm which detects and removes spam sites as you actively search, so black hatters go the way of the dodo.
There have been adjustments, changes, and what seem like complete rewrites of the algorithm Google started with in the beginning. At first, the results you were given were based directly upon the query as you’d entered it, sorted by how many backlinks each page had. Now, however, when you search for ‘Winnipeg Jets’, images and video for the team appear even though the words “images” or “videos” weren’t in the query.
The algorithm that Google, Bing and a handful of others use has grown and evolved to the point where it tries to anticipate what you’re searching for, as well as answer the direct query you typed. The search engines are getting better at bringing you what you want to see on a given topic, and seem to weigh the number of click-throughs to a result alongside all of the previous criteria. If you’ve ever mistyped a word or two while searching, you’ve probably noticed that search engines can correct commonly made spelling mistakes. What the engines are getting much better at, however, is not just correcting your spelling but interpreting what you may actually be searching for. Google can load dictionaries of how words should be spelled alongside common misspelled variations, and can look at how searchers correct their own queries and when they click on different variations. It can use this data not only to suggest a query with a different spelling, but to treat the misspelling as a synonym behind the scenes and rank the correctly spelled matches.
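As a toy illustration of that “misspelling as synonym” idea, here is a tiny sketch that maps a misspelled query term onto its closest known spelling before matching. The vocabulary and function names are my own invention for the example; a real engine would use learned correction data rather than simple string similarity:

```python
import difflib

# Hypothetical mini-dictionary of correctly spelled terms.
VOCABULARY = ["winnipeg", "jets", "restaurant", "theater", "schedule"]

def normalize_term(term, cutoff=0.8):
    """Map a (possibly misspelled) term to its closest known
    spelling, effectively treating the correction as a synonym."""
    matches = difflib.get_close_matches(term.lower(), VOCABULARY,
                                        n=1, cutoff=cutoff)
    return matches[0] if matches else term  # leave unknown terms alone

def normalize_query(query):
    """Normalize every term in a query before it hits the index."""
    return " ".join(normalize_term(t) for t in query.split())

normalize_query("winipeg jets schedual")  # -> "winnipeg jets schedule"
```

The key point is that the correction happens behind the scenes: the index is searched with the canonical spellings, so correctly spelled pages can rank for the misspelled query.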
In the same basket as interpreting what you might be looking for, Google is noticeably moving forward on trying to discern your intent while searching as well. A basic description of how it works:
Keyphrases don’t have to be in their original form. We do a lot of synonyms work so that we can find good pages that don’t happen to use the same words as the user typed. – Matt Cutts
Even Bing is getting in on the act: if you misspell a common word or phrase, you’ll still often be brought to the correct results. Perhaps soon enough you won’t need to search by typing at all, but simply by visiting your preferred search engine. Because you researched a new car and purchased one through an online dealership, the engine will know that in a few months it should perhaps deliver you information on local garages offering oil changes and tire rotations.
When you’re online, you can always find a survey, a post, or a questionnaire about almost anything. Recently, the results of one such survey were announced. The website SEOmoz.org ran what they called an ‘SEO Industry Survey’, and most of the results are… predictable. It’s a lengthy read, but you can find the full report here.
First off, they collected the usual data about their respondents: who are you, where are you, how long have you been working online, and so on. As per the results, the US accounts for the largest share of respondents, and there are slightly more female respondents than in 2012, up from 20.6% to 22.7%. Half of us work as in-house marketers and are 26-34 years old; obviously still a young market at heart.
When it came to the jobs we do, it seems that the majority of us feel we have more than one, although search engine optimization is the number one role we provide (92% of responses). It’s followed by analytics assistance (82%), with link building and content marketing in the same boat (71%), and coming up fast, likely due to Facebook, is social media management, with 70% of respondents saying it’s a role they take on.
Not that it should be any surprise, but almost all of us (93%) learn the trade online via websites, blogs, forums and so on, while nearly as many, 88%, state that the hands-on approach works best. After a sharp drop-off to 64% of us reading an actual book, there’s another dip where only half of us attend conferences or workshops to learn the tricks of the trade. When it comes to plying our trade, almost all the search engines will admit that content is king, so when we’re creating, sculpting or working on a site, the following chart gives a solid representation of where we devote our time.
An interesting patent has been discussed lately, which may be related to the recent shifts in Google’s index that seem to shake up the rankings pages every few weeks.
A basic description of how the patent reads:
Rather than allowing the rankings to respond immediately and directly to those changes, the system would change rankings in unexpected, counter-intuitive ways while they move from a first position, through transition positions, to the final “target rank” position.
It sounds a little strange to think that the algorithm would work in an almost backwards way, but it’s an interesting idea. It’s a way for Google to monitor situations where spam may be an issue in a website’s position, whether through links or content. And as the last portion of the analysis describes,
significant changes in position continue to happen even though there is no change in the page’s ranking factors
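One way to picture the “transition rank” idea is a function that walks a page from its current position toward its target position through noisy intermediate positions, rather than jumping straight there, so a manipulator can’t tell which of their changes (if any) actually worked. This is purely a hypothetical sketch of the concept, with invented names and parameters, not a reconstruction of the patent’s mechanism:

```python
import random

def transition_ranks(current, target, steps=4, seed=0):
    """Return a sequence of intermediate rank positions between
    `current` and `target`, with deliberate counter-intuitive
    wobble, ending at the target rank."""
    rng = random.Random(seed)  # deterministic for the example
    path = []
    for step in range(1, steps):
        blend = step / steps
        midpoint = current + (target - current) * blend
        jitter = rng.uniform(-2, 2)  # the "unexpected" movement
        path.append(max(1, round(midpoint + jitter)))
    path.append(target)  # eventually settle at the target rank
    return path
```

From the outside, a page drifting through such a path looks like it is moving for no reason, which matches the observation quoted above: positions keep changing even when the page’s ranking factors do not.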
Happy Birthday to Fresh
It’s been five eventful years now since Fresh made its landing in Winnipeg, and we’ve steadily become the online leader in the city. We’ve seen the construction of a new museum and stadium, the return of the long-lost Jets, and a steadily growing business sector.
In terms of online change and growth, the internet has brought us nearly a billion Facebook users, the introduction of an alternative in Google+, and many changes and updates to the way people access information on the web. Online search, and businesses using their websites to capture their audience, has grown steadily in Winnipeg. While it was a little slow in the beginning to show locals the way, we’re surely and steadily getting there, with more businesses and people coming online each day. It’s been a great beginning, Winnipeg, and we look forward to many more years to come.
If you’ve felt a little overrun in the last little while by the Google zoo that’s been running through the index, we’re a tad sorry to report that we’re not quite out of the wild yet.
Google has mentioned that while the Penguin algorithm shift is targeted at removing or reducing spam sites in the results, it’s still actively being adjusted and worked on. The Penguin update acts as an adjustment (their word) to the results, removing the backlink value that spammy sites could pass on to those looking to make a quick buck or a shift in their positioning. As for Panda (that was the content update, in case you’ve forgotten), it is still tuned for digging out poor content on sites and pages. And although the updates are coming more consistently now that they’re automated, the shock and surprise that website owners initially experienced from positioning drops have lessened.
Panda is now a regular monthly addition to the search index, and Penguin is being incorporated in much the same way, though it still has a ways to go. With a tool to handle unnatural links pointing to your website soon to be added to Webmaster Tools, there will undoubtedly still be some site owners experiencing a shift in rankings if they’re inattentive to their site. Over time, however, it will become part of the regular indexing, and the results will be cleaner for it.
Matt Cutts on the growth Penguin still needs:
If you remember, in the early days of Panda, it took several months for us to iterate on the algorithm, and the Panda impact tended to be somewhat larger (e.g. the April 2011 update incorporated new signals like sites that users block). Later on, the Panda updates had less impact over time as we stabilized the signals/algorithm and Panda moved closer to near-monthly updates. Likewise, we’re still in the early stages of Penguin where the engineers are incorporating new signals and iterating to improve the algorithm. Because of that, expect that the next few Penguin updates will take longer, incorporate additional signals, and as a result will have more noticeable impact.
Late last week Google announced an additional signal for how it will handle search results. Starting last Friday, Google will take valid DMCA requests into consideration when ranking the index. While the new portion of the algorithm hadn’t yet gone live, they did have this to say:
Sites with high numbers of removal notices may appear lower in our results. This ranking change should help users find legitimate, quality sources of content more easily – whether it’s a song previewed on NPR’s music website, a TV show on Hulu or new music streamed from Spotify.
There are a couple of Google-owned properties which are notorious for hosting copyrighted content, specifically YouTube and Blogger. And while they tend to receive the lion’s share of DMCA requests, Google has said it’s the valid takedown requests which will be used as the metric to decide who should stay and who should fall. It’s the next major algorithm shift in store for site owners, and it will be interesting to see where it takes the content of the web.
Google is taking another page from Facebook’s social networking playbook and will begin allowing vanity URLs for some select profiles. Currently the majority of Google+ URLs end with a long string of numbers denoting your profile, while some are being tidied up.
While the idea is to roll the vanity feature out to all users, currently they have only passed the cleaned-up addresses to a few on the social network. It’s a step in a good direction for Google’s social offering, but they still have a fair amount of ground to cover to catch up to Facebook. One small problem has been picked out with the change: the new vanity URLs haven’t been rolled out with optimization in mind. The new addresses are being referenced as canonical URLs rather than the old ones issuing a full 301 redirect, which would pass the full weight of the content along to the search engines.
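For reference, the two approaches look like this (illustrative fragments only; the vanity address shown is hypothetical):

```
<!-- Canonical hint: the old numeric page merely points at the vanity URL -->
<link rel="canonical" href="https://plus.google.com/+ExampleName/">

<!-- Full redirect: the old numeric URL answers with a permanent 301 -->
HTTP/1.1 301 Moved Permanently
Location: https://plus.google.com/+ExampleName/
```

A canonical tag is only a hint to the engines, while a 301 is a firm instruction that the page has moved, which is why the distinction matters for passing along ranking signals.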