There’s been a small surge of malware reports coming from searches on Bing and Google, which isn’t news in and of itself, as malicious results are always buried somewhere in the listings. What is different this time is that, for some terms, more than 90% of image results were found to be malware related.
The most targeted term this time around happens to be “Emma Watson”, whom McAfee has named its most dangerous celebrity search of 2012. Of the two engines, 30% of Google’s searches had malware warnings attached, and more than double that came out of Bing’s results. Malware takeovers happen in a couple of different ways: one of the most frequent is through websites built with little to no security, and then there are throwaway websites and URLs used purely to spread malware around the web.
Black hat SEOs typically go after the hottest search terms and poke around the web looking for websites with loopholes in their security. They actively work to hijack the website and its URL to lend false authority to whatever term they’re wanting to spam. And because uneducated or hasty users tend to automatically trust the top results in the search engines, the spread of malware will continue.
Because of the recent discovery that image results are getting slammed with malicious links while the text results pages are being left behind, Bing has been unofficially dubbed (for now) the most poisonous search engine. The moniker stems solely from the report that malicious websites are now being targeted at image searches as opposed to the text results pages. Not to fret, however: Bing and Google will take steps to close the holes that have been opened in the image results, and in the meantime just be a little more cautious before clicking that top image of your favourite star.
There’s another minor change coming shortly to Google AdWords campaigns. It’s not exactly something new, but it’s a change that will affect those who are less diligent with their campaigns.
Google has always been a stickler for content; no matter how you feel about the company or the way it conducts its search business, that’s always been the case. With the trillions of web pages out there, the job of finding the most relevant results per search can be difficult, but they do their best. The advantage they have with the AdWords platform, however, is that they’re able to monitor whether you’re actually following the rules they’ve laid down. One of the bigger rules that the search engine has finally decided to get tough on is just where your ads point to. From the AdWords blog:
Our existing policy has required each sitelink in a campaign to link to a different landing page. That means a user would have a meaningfully different experience on the landing page from each sitelink.
The reason that the company is going to start laying the smack down? They’ve done some checking and noticed that multiple ads are all heading back to the same page, a no-no in their guidelines.
What happens moving forward is basic: initially the AdWords team will focus primarily on newly created ads and campaigns, ensuring that the landing pages are unique. If your sitelink matches an existing page, only one of the two will be listed; the other will be restricted from showing up. The idea behind the enforcement is obvious: they want to give everyone the chance to appear in the results, and the best way to do that is to make sure everyone is following the rules they’ve set out. Now don’t fret and exclaim that you need to tear down all of your campaigns and rebuild them from scratch. The folks on the AdWords team realize that taking care of your sitelinks isn’t a small effort, so enforcement for existing sitelinks will be a few months out; at first, only new sitelinks will be affected.
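As a rough illustration of the policy, here’s a minimal Python sketch of how duplicate sitelink landing pages might be filtered out of a campaign. The campaign structure and helper function are purely hypothetical for this example; they aren’t anything from the actual AdWords API.

```python
# Hypothetical sketch: keep only sitelinks whose landing page hasn't
# already been used in the campaign, per the updated sitelink policy.
from urllib.parse import urlsplit

def unique_landing_pages(sitelinks):
    """Return the sitelinks whose landing page (host + path) has not
    been seen yet; later duplicates would be restricted from showing."""
    seen = set()
    allowed = []
    for link in sitelinks:
        parts = urlsplit(link["url"])
        page = (parts.netloc, parts.path.rstrip("/"))  # treat /pricing and /pricing/ as one page
        if page not in seen:
            seen.add(page)
            allowed.append(link)
    return allowed

campaign = [
    {"text": "Pricing", "url": "https://example.com/pricing"},
    {"text": "Plans",   "url": "https://example.com/pricing/"},  # same landing page
    {"text": "Contact", "url": "https://example.com/contact"},
]
print([s["text"] for s in unique_landing_pages(campaign)])  # -> ['Pricing', 'Contact']
```

In this sketch only the first sitelink pointing at a given page survives, which mirrors the described behaviour of one of two matching links being restricted.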
Canada is still coming into its own online, and perhaps Google has come up with the right idea to encourage some growth. The Google eTown award is designed to recognize towns where small businesses are investing in online tools and resources to find new customers, grow their business, and improve their operations.
“We’re delighted to recognize the accomplishments of the small businesses embracing the web in each of these cities,” said Chris O’Neill, Managing Director of Google Canada. “We know that the Internet is going to contribute massively to Canada’s economic growth.”
We’ve been saying for a few years here at Fresh that online business is still coming into its own for all us Canucks; hopefully this new incentive will help spur things along. They’re not playing any favourites, and have divided the country up into 5 zones: Atlantic, Quebec, Ontario, Prairies, and BC & North. This year the 5 winners are Moncton, New Brunswick; Dorval, Québec; Parry Sound, Ontario; Canmore, Alberta; and Duncan, British Columbia.
Across each of the winning towns, businesses have discovered that the Internet has become a vital tool for reaching new customers and engaging with existing ones. In recognition of their eTown status, Google Canada will be presenting the awards to each city during events throughout October. Local businesses will be invited to attend and meet Google experts who will be on hand to provide advice on growing a business online. Actively working on your brand image is one of the key initiatives you need to pursue while growing online. Translating your offline business into your online image is one of the goals you should be aiming for, and when you’re ready for that step, Freshtraffic is here to help.
There have been a couple of Google updates in the last week which have had a visible effect on the results pages. One of the updates was fairly simple: a Panda update which, according to the available information, impacted less than 1% of searches. The other update changed how the results page returns your search results.
The Panda update was another in a long string of updates designed to improve overall search quality, and with the change affecting such a small sample, if you’re following the rules you shouldn’t notice any issue. If you’re a numbers type of person, you can compare the less than 1% affected by this latest update to the 12% affected when Panda made its initial breakout onto the web. One point to bear in mind, with this news coming straight from Google, is that there may be some surprises in store under the hood of this latest update.
Maybe the extra work they put into the Panda update has something to do with the tweet from Matt Cutts earlier this week?
Just fyi, we rolled out a small algo change this week that improves the diversity of search results in terms of different domains returned.
Diversity in the search results is a great growing point, as it gives more opportunity for optimization to show its mettle. It’s possible to have an entire first page of results completely dominated by a single website, if your optimization is that strong and your competitors are that weak. While the change was added only this week, reports are that it hasn’t yet been noticed in the bulk of the results.
And in a bit of a throwback to the beginnings of the web, according to a recent posting, the keywords tag has returned. According to Google News Product Manager Rudy Galfi:
The goal is simple: empower news writers to express their stories freely while helping Google News to properly understand and classify that content so that it’s discoverable by our wide audience of users.
Similar in spirit to the plain keywords metatag, the news_keywords metatag lets publishers specify a collection of terms that apply to a news article. These words don’t need to appear anywhere within the headline or body text.
A small difference in the way the news_keywords tag works is that you’re limited to 10 terms, likely to stop users from cramming in as many terms as they like. It’s a new way to use an old tag, and it’s going to be interesting to see how its use plays out.
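As a quick sketch of the tag in practice, here’s how a publisher might emit it. The tag name and the 10-term cap come from the announcement above; the helper function and the truncate-to-10 behaviour are illustrative assumptions for this example.

```python
# Illustrative sketch: build a news_keywords metatag, capped at 10 terms.
def news_keywords_tag(terms, limit=10):
    """Build a news_keywords meta tag, keeping at most `limit` terms.

    Assumption for illustration: terms past the cap are simply dropped,
    since the announced limit is 10 terms per article.
    """
    terms = terms[:limit]
    return '<meta name="news_keywords" content="{}">'.format(", ".join(terms))

tag = news_keywords_tag(["Emma Watson", "malware", "image search", "Bing"])
print(tag)
# -> <meta name="news_keywords" content="Emma Watson, malware, image search, Bing">
```

Note that, per the announcement, these terms don’t need to appear in the article’s headline or body text at all.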
It’s long been clear that a search engine is a search engine, and there are a handful online that receive the majority of the traffic out there. Google, Bing and Yahoo are the usual services used to search, with other sites like Blekko, DuckDuckGo, and Ask also being used by those desiring a different experience. Facebook, however, is one of the largest websites online, and with its user base approaching a billion, the world has been waiting to see if it will try to enter the search arena.
Ever since Facebook went public, the stock has been a sort of tepid pool, and with no real revenue model the online mutterings often turn to the topic of a search engine. And when Mark Zuckerberg throws around statements akin to “Facebook is doing a billion searches per day without trying”, the mutterings pick up some volume. In a recent TechCrunch interview, Zuckerberg made it clear that the company realizes there is a huge opportunity for a search engine within Facebook, but tempered that by also talking about how the way users search is ever evolving.
“Search engines are evolving” to “giving you a set of answers,” Zuckerberg said. ”Facebook is pretty uniquely positioned to answer a lot of questions people have. At some point we’ll do it,” he went on. “We have a team working on search.”
There are some real concerns about just how Facebook could leverage its massive user base as a search engine, however, and they have less to do with spidering capabilities than with privacy. The system as Zuckerberg briefly described and envisioned it would take the opinions of your friends, family and contacts and try to form a result for your query. Searching for terms like ‘best burger’ or ‘new batman movie reviews’ wouldn’t necessarily be informational, but would deliver a list of opinions from your contact list. At any rate, it’s not happening today or tomorrow, or even soon for that matter. But if and when it does, it will introduce change into the search landscape and the online experience as a whole, and change is very, very good.
A search engine seems like a rather simple tool on the surface. Whether it’s Bing, Yahoo or Google as your flavour, when you arrive on their page you’re greeted with an empty text box and a search button. It’s not difficult to work out that with a few keystrokes you can be off and searching for whatever you fancy. But as simple as it seems, it can be a much more powerful piece of equipment if you take the time to learn how to leverage its power in your favour.
Google has stepped up this year, offering a class on how to use its search engine to discover what you’re looking for. It’s a free, online course whose goal is really just to improve the user experience, both for you and for them. From one of the Google blogs: “The community-based course features six 50-minute classes along with interactive activities and the opportunity to hear from search experts and Googlers about how search works.”
One Google representative likened it to always driving a vehicle in first gear: it will definitely get you where you’re going, but there’s always a faster, better way to do it. The Google search engine offers a whole host of ways to construct a query, whether you want to do basic calculations, look up a time zone, or find a zip code. Even if you’re having trouble describing what you’re searching for, you can drag an image directly into the search bar and conduct your search that way. Add to the couple of tips here all of the operators you can use (“”, +, -, etc.) and the possibilities begin to multiply. I guess the question is, how’s your Google-fu?
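To make those operators a little more concrete, here’s a small Python sketch that assembles a few of the query styles into search URLs using only the standard library. The example queries themselves are made up for illustration.

```python
# Illustrative examples of common search operators, assembled into
# properly escaped Google search URLs with the standard library.
from urllib.parse import urlencode

queries = [
    '"winnipeg search engine optimization"',  # exact-phrase match
    'jaguar -car',                            # exclude a term from results
    'site:example.com traffic',               # restrict results to one site
    'time zone winnipeg',                     # trigger a built-in answer
]

for q in queries:
    print("https://www.google.com/search?" + urlencode({"q": q}))
```

The `urlencode` call handles the escaping of quotes, spaces and colons, so the operators survive the trip intact.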
Yesterday afternoon, for around 4-5 hours, GoDaddy suffered an issue with their DNS servers. Some reports have it tied to a hacking attempt, which resulted in the complete shutdown of (a projected) hundreds of thousands of websites. Only GoDaddy really knows how large the affected number is, but while you may have lost traffic yesterday, you can rest easy about your rankings and position, at least with Google. John Mueller had this to say about the outage: “In this case (the service outage), assuming it was just the DNS that wouldn’t resolve, we would treat it similar to how we treat other temporary crawling issues, and just retry at some later point.”
There’s been an interesting tidbit of news over the last couple of days surrounding Google, although it’s not about any search decisions or changes they have made.
One notion being put forward is that Google finds itself wary of Amazon and its recently announced Kindle Fire. The new tablet runs on the Android operating system, which Amazon has tailored to its own use. Normally this wouldn’t make a whole lot of news; there are a handful of tablets on the market of varying quality, and some use Android. The reason the Fire has made a ruckus is because of how it’s been tailored.
Normally when a company tailors Android to its use, the default search engine is Google. Amazon, likely because of its business model, chose to forgo Google as its search provider; instead, when you search on the Fire you’re directed to an Amazon search results page. As a user this makes sense: you’ve bought an Amazon product, so if you’re searching for, or curious about, a product, you receive an Amazon page. The reason this is a big deal for Google is that when people conduct a search to purchase or research a product, they’re qualified users, so Google can serve ads on the results page. The ads that appear on product search pages are worth more to Google, so they make more money. With Fire users being directed to an Amazon results page, they’re skipping that money generator entirely, a complete loss for Google.
What hasn’t been truly answered is what happens if you were to do an informational search, say for some local movie theaters or restaurant bookings. Amazon isn’t so much a local retailer as a global one, so it’ll be interesting to see what gets delivered.
Google is the most widely used search engine globally, and accounts for roughly 65% of usage in North America. Bing is the rebrand of the Live search service, and it has lately been holding steady at around a 30% share of the search marketplace.
Bing has long contended that they have a comparable search service, and some in the search world share their sentiments. But even with their rebrand, television commercials, and their takeover of the Yahoo search market, their share remains steady at around a third of the market. Their latest effort is dubbed the Bing It On test: a blind survey which displays unformatted, unbranded results, and the user decides which of the two sets of results they would use. It’s a testing method also known as the Pepsi challenge, where random people were given samples of two drinks and asked which they preferred.
When Bing tallied the results of their (very small) online sample of 1,000 people, they found that users chose the Bing results at almost a 2:1 ratio. That’s a rather large statistical difference from the current norm, with Google dominating the search market share, so why don’t the numbers match current market share? Well, for starters, the sample is incredibly small. A data sample of 1,000 people in the 18+ demographic is a drop in the ocean, with there being somewhere north of 200 million such people in the US alone. If you’re interested in which search engine appeals to you as a user, you can try out their survey for yourself here.
So after talking about the new patent papers filed about tricking spammers into showing themselves, with Penguin and Panda breathing down your website’s neck, what is left for you to do as a webmaster? You follow the basics and stick to the best practices guide; it sounds boring and cliché, but it works.
Focus on content creation: when you put effort into creating great pieces of content, the result is increased traffic to your website via shares and referrals. Another strong way to let your company and site become well known is to distribute press releases for major changes. Make sure you don’t abuse press releases, as talking about every little bit of activity can give your company a spammy image. Take advantage of press releases and the traffic they generate when you have something impressive to share.
Make sure to take advantage of the growing social boom, and add social sharing buttons to your website. Don’t assume that users will take the initiative to share your content on their own. Making it simple for your visitors to share your content with social share buttons is a quick way to garner additional views. Going hand in hand with social sharing and social media is creating your own YouTube channel. YouTube is the web’s second largest search engine, so creating videos showing off your products or services is a great way to expose new audiences to your brand. And with the search engines starting to mix media results in with the organic results, you may even begin to find new visitors coming to your site via your YouTube channel.
And one of the most important tips that can be offered on improving your visibility: when you land a new client, always do your best to over-deliver on your company’s products or services. Word of mouth is still one of the strongest advertising methods, online or offline, new media or old. Once you’ve thoroughly impressed a new client, the odds of them bringing their friends and family to your site are very high. Be sure not to step on their toes, however, or ruin a relationship when dealing with a client, as the only thing that travels faster than great news is bad news.
Previously I wrote about a Google patent which has gained more and more traction in the search community. The patent in question is named Ranking Documents, and as mentioned previously, it seems like an interesting tactic to employ to catch those who use less than ideal methods to rank a website.
Unfortunately, you’ll often find that when people discuss search engine optimization they tack on terms like spamming or buying links, but the truth of it is that when you do it right, SEO is none of those things. I think of the newly discovered patent like a magic trick: it’s a bit of sleight of hand that Google is using to get those unscrupulous tacticians to reveal themselves. Google shows them the index; they try to spam their way to a high ranking. When Google notices the spam, it shows a different set of results (if you don’t think they can do that, you’re not thinking clearly) just to see the reaction of its target. If they continue to offend, their site will likely be penalized, while the legitimately ranking pages won’t have been affected negatively.
Spamming links to a website to give unearned weight to a term is a measure used by those in the industry who are in the game for only a few quick clicks. Often using disposable website URLs, they will target a hot, trending term (as an example, Paul Ryan’s RNC speech) and will try to garner as many links and as much visibility as possible in as short a time frame as they can. It’s a pump and dump tactic that is very likely to get the offending site banned from the index and the URL blacklisted.
I foresee that this patent will be dissected, discussed, scrutinized and blamed for all manner of SEO problems and headaches to come. The only problem that some site owners have with Google, Bing and Yahoo is that the engines are getting better and better at catching the people who try to cut corners or use less than natural methods to rank their sites. Maybe one day soon, we’ll have an adaptive algorithm which detects and removes spam sites as you search, so black hatters go the way of the dodo.