Browsing "internet strategy"
Continuing from yesterday’s post about prepping for the new year and being ready to take the bull by the horns online, below you’ll find a few tips to be aware of as you get ready: an insight or two, as well as some advice to bear in mind as your advertising year starts anew.
An easily forgotten fact is that web crawlers from the search engines treat each individual page of a website as its own entity. This means that rather than regarding the optimization of your website as a whole, it can be more productive to think of it as the optimization of each page. This approach is much more involved and requires meticulous care, but once all of your pages have been optimized, your entire site will be search engine friendly and loaded with relevant content.
An important SEO fact every business owner should know is that true, organic SEO requires a multitude of techniques, and time for those techniques to work, if an SEO campaign is really going to be successful. Using a single SEO method may help improve your results a little, but if you want your business to be ranked highly by the search engines for a long time, and to increase the amount of traffic that visits your site, then proper identification and implementation of acceptable SEO techniques will be required. It is also important to note that rankings can, and do, fluctuate day to day within the search results page. It’s impossible to guarantee the number 1 spot in the results, but when Fresh works your site with our proven organic optimization techniques, you will climb in the rankings.
The third basic fact in search engine optimization that you should always keep in mind is that SEO is a continuous process. It does not stop, because there’s always another website, another business, and/or another competitor knocking on your door online, trying to take your place. Once high rankings have been achieved through an efficiently run SEO campaign, if that optimization is suddenly halted, then it is likely that the rankings achieved will drop too. You could liken the effect to watering a flower garden properly and with care, then suddenly stopping: all of your effort in raising those plants will be for nothing. Google, Yahoo, and Bing are constantly changing and developing their technology, and new ranking techniques are constantly being developed in turn. If you want to remain on top, then you must stay updated on all developments and make the necessary changes to your SEO campaign.
At Freshtraffic, we’re constantly monitoring our clients’ sites, and if a consistent slip in rankings is found, rather than a momentary dip during reindexing, we will retool the site to return to, and retain, the previously held high ranking.
As a business owner or website owner, whether you’ve been online for a decade or are only stepping into the worldwide marketplace, there’s an ever-present question: to dot com or not to dot com? Let’s take a look at some of the differences.
1. Clear geo-targeted name
To own a local domain, you actually need to go and buy it and register with a local authority. Because of this, local domains have always represented the best controlled and strictest identifier of a specific geography. There are exceptions of course, but these mostly have to do with certain domains, such as .tv (the tiny island state of Tuvalu), whose particular geography happened to hold a gold mine of a domain name it could use to generate revenue.
For example, if a site was a French site operating under a .fr domain, then within hours of a search engine crawl the site would show up in the area called “Pages de France,” or pages from France, even if the site was actually hosted in the US.
2. Solid site architecture
The argument is often put forward that it’s too expensive to switch an existing dot com website with zillions of pages over to the relevant local domains its owners wish to target. It can, of course, be expensive to switch domains, and this needs to be done with great care. However, when the cost of making the change is calculated, businesses will often find it is outweighed by the ongoing cost of the SEO needed to compensate for not having the relevant local domain. That compensation could mean additional local hosting costs or even substantial link building to overcome the disadvantages of the dot com. Every business owner should have “going local” as a part of their long-term plan.
3. People are inclined to buy locally
Some SEOs may not see conversion factors as the most important consideration when recommending which steps a client should take. Some users, however, read URLs in the search engine results, and that can have a direct impact on how many of them click on links. Say you’re looking for a “second hand car” and you live in Canada. If you know nothing else about a website, which is most likely to be the most compelling: “secondhandcar.com” or “secondhandcar.ca”?
Even beyond the search results page, the local domain vs dot com question plays in the mind of the user. “If this is a .ca and I live in Winnipeg, then they’re more likely to deliver” is a reasonable conclusion for most folks to draw.
It’s important to understand that there are no “direct” SEO benefits to hosting on a dedicated server. In other words, search engine spiders don’t check if your website is hosted on a dedicated server in order to give boosts or penalties based on what they find out.
There are quite a few indirect SEO benefits associated with dedicated servers though. First of all, you don’t have to worry about your website being in a “bad neighborhood”. The main problem with shared hosting is that a lot of websites, sometimes even thousands, end up being hosted on the same server/IP.
Unfortunately for the people who only employ legitimate approaches to search engine optimization, not everyone is willing to play fair. There are webmasters who are always on the lookout for the fast way to success and it’s safe to assume that out of hundreds or thousands of websites hosted on a certain server, the chances are that at least a handful may be “problematic” from that perspective. In other words, even if you do everything right, there’s a chance that you’ll have to pay for other people’s mistakes.
There’s also the issue of page loading and page refresh speed which needs to be taken into consideration. Algorithms are constantly changing, and some search engines take page load times into consideration; at least to a certain degree. It makes perfect sense if you stop for a moment and think about what search engines are meant to do. Search engines all want to display the best possible websites when someone searches for a certain term and fast page load times are important when it comes to the experience of your visitors. Of course, there are other factors which also are taken into consideration and most of them have more weight than page load times. But still, having an edge, however insignificant it may seem, is always a good thing, especially if you’re targeting terms which are extremely competitive.
But, the most important thing that dedicated servers put on the table is complete control. It’s your server and you can use it as you please. This means that you have more control over the software that you’re using and that you can tweak absolutely everything.
It’s the largest, most widely used search engine out there, and you know it well, as it’s basically everywhere: Google. At this point in time it’s the holy grail for businesses either already existing online or thinking of it, and there is an incredible laundry list of items to do and not to do, and in between all the white and black exists the grey.
I’m going to talk, just very briefly, about some of the black, or the methods which will (most likely) end up with your site BANNED FROM GOOGLE!
Redirecting to another domain:
Redirecting to another domain is not a 100% guarantee that you’ll be banned from search engines, but it is a very common spammer trick used in conjunction with doorway pages and cloaking. If you set up a redirect that goes to a new domain, you need to write it as a 301 redirect. This tells Googlebot that the redirect is permanent, and that it should change the listing to the new domain.
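As a sketch of the legitimate way to do it, here’s what a permanent redirect might look like in an Apache `.htaccess` file (the domain name is just a placeholder):

```apacheconf
# Hypothetical example: permanently (301) redirect every request
# on the old domain to the same path on the new domain.
RewriteEngine On
RewriteRule ^(.*)$ https://www.newdomain.example/$1 [R=301,L]
```

The `R=301` flag is what marks the move as permanent; a bare `R` flag defaults to a temporary 302 redirect, which tells crawlers to keep the old listing.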
Invisible or Single Pixel Images:
Invisible images are images that are 1×1 pixels in size and cannot be seen by the naked eye on a Web page. This is similar to hiding text or displaying different content to search engines than to your customers. And don’t assume that search engines can’t read CSS or HTML tags that resize full-sized images. If you do this to optimize your pages, your site will be banned.
Hidden or Invisible Text:
Hiding text by making it the same color as the background color may fool your customers, but it won’t fool search engines. Another variation of this is where you make the font size so small that it’s unreadable by the naked eye. Text that is hidden from your readers but visible to search engines is considered spam and will get your site banned.
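To illustrate, here are made-up snippets of both variations described above; either is trivial for an engine (or an informed reader viewing the page source) to detect:

```html
<!-- Hypothetical examples of hidden-text spam: do NOT do this. -->
<!-- Text the same color as the background: -->
<p style="color:#ffffff; background-color:#ffffff">keyword keyword keyword</p>
<!-- Font size too small for the naked eye: -->
<p style="font-size:1px">keyword keyword keyword</p>
```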
It can be very tempting to use cloaking, but while it might give you better results at first, search engines don’t like it. Search engines want to provide a resource of information that is real, not something that has been doctored to give artificial results. The technique is called cloaking because it is an effort to hide, or cloak, what your site really delivers behind something that might be seen as more palatable to search engines.
These are some of the basic black hat SEO techniques out there, and all are easily noticed, even by a person browsing the net, now that you know what to look for. These factors are paramount to keep in mind when choosing your SEO expert.
Step 1: Don’t worry. The Google dance occurs regularly and can cause a lot of movement in rankings, meaning that it’s come to be feared by many in the SEO industry and anticipated by others. The update isn’t just one sudden switch, though, as each index update takes several days to complete. During this update the searches seem to ‘dance’ between the old index and new index – hence the name.
So why does it happen? Well, Google pulls its results from thousands of servers, and they can’t all be updated at once. Instead, each server is updated with the new index, one at a time. This can cause very strange behavior in the page rank process if two major sites located on separate servers happen to have a close linking bond. These sorts of separations are interesting and can contribute to a great deal of change and motion in page ranks. The most important thing to keep in mind is that eventually Google will rank you where you belong. Generally, if you behave, you will not be thrown around for long by the odd activity that can occur when Google is in the process of updating its index.
One common misunderstanding is the idea that Google controls which server each kind of information is coming from, and so stores similar information on the same server. Google’s index doesn’t work this way – it’s a big, disorganized mass of information that Google searches very quickly. This is a blessing in disguise because it allows your site to remain reachable via other sites that are related to it when the index is taking place. Your site generally won’t suffer for too long when an update is taking place anyway, but if you are heavily dependent on Google results, you will see a slight drop for a short period of time. This drop is often followed by a slight spike especially if your page rank has increased since the last round of indexing.
The servers that Google uses are distributed between datacenters all over the world. Google doesn’t keep all of those eggs in one basket – they want to be able to lose one datacenter and have the rest survive. This way, if part of Google goes down, people can still use the search engine, and as was said before, this allows your site to be accessed via related sites if the server holding your site’s index happens to go down. The datacenters that Google has put into play are enormous in comparison to most datacenters around the world. Each of Google’s datacenters rivals some of the largest in the world, and combined they would probably be the largest of all.
You see, the ‘time-to-live’ for Google is only five minutes – that means that Google can change its IP address every five minutes. This allows them to switch between their datacenters regularly, spreading the search load between them intelligently and routing around any damage. If you constantly entered the same datacenter with every search that you placed, it would almost certainly fry within twenty-four hours. Considering the number of users on Google each and every day, it is surprising that the servers they do have are enough. A server can only handle so much traffic in a day, and Google ensures that it can hold more than any other service on the internet.
The datacenters updating their indexes at different times is what causes Google to do its dance. Unless you’re looking for your website’s ranking, you’d never notice this, as your site is normally available at all times. The unfortunate bit is that oftentimes you will lose your page ranking for a short period of time, or your site will seem to have a lower number of pages indexed by Google. If you ensure that you have several hundred pages available on Google at all times, you will most likely be able to provide all of your content at all times, either directly or indirectly.
Step 2: There is no step 2
Back to the basics again. We’ve talked about building, some basic tagging, and basic linking strategies. Another very basic step in the realm of SEO for the layperson, would be to make use of the current social media craze.
Facebook, Twitter, LinkedIn, Blogger/WordPress, and to a lesser degree, Myspace, are all very viable and encouraged spaces to use to market yourself and your new website.
Facebook offers a great audience and is one of the most widely used social networks out there. Using their free services to help boost your site’s visitors is as simple as having friends and family add you to their accounts, and just spreading the word. Use the page to update your news frequently, and watch the visitors begin to use your site and services.
Twitter, when used in conjunction with Facebook, adds an additional dimension to your company’s profile. Being able to update your news and interact with any customers or clients in an almost instant medium is a great draw for potential clientele. Because Twitter can be used anywhere, you could be on the road, at the office, or at home and still be able to handle any questions or concerns which could impact your image.
LinkedIn is a professional’s version of Facebook. It can be used to begin building your professional image with established companies, locally and globally. Lenders, suppliers, potential partners, and alliances can all begin with simply having an account on LinkedIn.
Writing blogs is like icing on the cake. It’s the little bit of everyday, active information about your company, your niche, or even just the industry around you. Taking a short time each day to send out a blog post gets your blog site indexed within the day, and over time, with dedication, you’ll begin to see the visitors grow and traffic flow through your site.
More to follow…
With the recent influx of SEO experts, there’s been a flurry of activity in the industry, both nationally and locally. Website designs are being evaluated, and online presence (notably the lack thereof) is being felt by businesses in Winnipeg, and in Canada as a whole.
The idea that search engine optimization is a required expense for a successful business moving forward has slowly been seeping into the collective business mindset. It’s apparent to the true experts in the field, and when you see half-hearted attempts by those trying to wedge their way into the niche, it’s time to just shake your head.
Some of the gems found in just an afternoon of browsing:
Poorly optimized websites from the “experts”
Blogs/Blog sites setup and used once or twice
Companies guaranteeing position
“One time SEO” websites
The majority of the above examples came from the websites of web design companies who’ve decided that they are ready and capable of jumping into the deep, dark sea of Search Engine Optimization (SEO)/Search Engine Marketing (SEM).
There are millions of pages on the web packed with information on SEO, on some of the general guidelines, and on the pros and cons of the marketing avenue. But for every good page out there, there are 2, 3, or 10 pages of misinformation as well.
In the end, it’s best to remember: SEO/SEM is a niche industry, horses for courses, and the real pros don’t just publish all of their tips and tricks. Their results speak for themselves.
Magic! Mysticism! Mystery!
Nope, guess again. Search Engine Optimization is, to a lot of people, full of magic and mystery. It has none of these qualities, yet when it’s done right, it can definitely provide the illusion of them.
Proper SEO can change a company’s future, drive internet marketing and sales, give your campaign that needed visibility, and more! When the rules are followed and the steps are taken, it can be magic.
That being said, there’s black magic out there too, of course: the not-so-legitimate tricks such as cloaking, hiding text, and keyword stuffing, just to name a few. The temptations of fast results often come with their just rewards, from penalties against your site to the outright banning of your site from the index.
In the end, the good guys win, and the bad look for a new way to use their tricks.
Which hat will your hero wear?
There are simply too many benefits to using CSS (cascading style sheets) to ignore. Is your page code long and difficult to read? Is it increasingly harder to find that one line to tweak your page’s layout or appearance? Cascading Style Sheets will help improve your website’s efficiency, and it will be easier for search engines to read. To a search engine, a website that is coded in CSS says several things about your company: that you embrace forward technology, and that your content is modern, up-to-date, and relevant. Using Cascading Style Sheets is one of the best search engine optimization methods for creating a web site; it is candy to search spiders.
A Cascading Style Sheet is a method used to give web designers more control over the style of each page — allowing them to customize each paragraph, title, header, or bulleted list. CSS gives you the ability to specify font, size, and color of anything you want without having to repeatedly insert specific lines of code.
A few key points to using CSS to help optimize your website are:
1. Using regular HTML codes to define things like font tags can be very cumbersome. It inserts extra unnecessary code into the source code of your website, making it more difficult for the spider to crawl your site. Search engine spiders won’t just read the content that shows up on your site, they read every single line of code. You don’t want to pollute your source code with excessive HTML.
2. In a nutshell, CSS can do everything HTML can do, only better. Instead of placing the style tags (for fonts, backgrounds, colors, etc.) in the actual code, they are placed in an external file. A simple and brief statement is placed in the source code to call to that external file where all your bulky style codes are laid out, leaving your page clean and simple.
3. The use of Cascading Style Sheets also allows your browser to load your website’s styling once and carry the basic elements forward through any page you visit. This is optimal for search engine spiders; they want to read through your website as fast as they possibly can. With CSS, you can have all the effects you want without slowing down the search engine spiders you are striving to impress.
4. Give your website a diet plan! Converting your old HTML layouts to CSS can typically cut anywhere from 25 to 50 percent off of page size. You get the added benefit of a simplified, adaptable structure and the separation of code and presentation.
5. One of the most instantly rewarding benefits of Cascading Style Sheets is that they are a huge time saver. Code it properly once, and never again. Let’s say, for example, your site is coded with HTML and the header on each one of your site’s pages is written in blue text. One day you decide you want to change the look and feel of your site, and you want to change the headers from blue font to a bold black. With an HTML website, you have to go through each page and physically change each header. Depending on the size of your site, that could be very time consuming. With a website written with CSS, you simply open that external file and change the header color from blue to black, and your job is done. That one change will automatically update every header without you having to manually change each one.
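As a sketch of that last point, a single rule in an external stylesheet (the file and class names here are just examples) restyles every header on the site at once:

```css
/* styles.css -- one external file shared by every page on the site.
   Changing this single rule updates every header, everywhere. */
h2.page-header {
  color: #000000;    /* was blue; now black */
  font-weight: bold; /* and bold, as in the example above */
}
```

Each page then pulls the rule in with one brief statement in its head section, such as `<link rel="stylesheet" href="styles.css">`, instead of repeating bulky font tags around every header.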
What would we do without Google?
Google’s motto to “organize the world’s information to make it universally accessible” is perhaps the one defining characteristic that we can rely on.
The fact that we can debate whether Google lives up to its other motto – “don’t be evil” – means that we cannot really rely on them to be virtuous.
But we can rely on them to organize the world’s information, and they’ve done that fairly well. At least, they’ve done it better than anyone else. If it meant being evil to do that successfully, would we be willing to live with the evil?
I think we could advance the argument that most people who consider Google the evil empire probably do so on the basis of some perceived action by Google that has been unfavorable to them or someone they know. In other words, if you operate a website and you’ve been de-cached, or your Google AdSense account was discontinued, it’s easy to see Google as the villain.
But Google doesn’t just penalize people without cause. There is usually a violation of policy somewhere, and if there wasn’t, then I’d bet my last dollar that Google would reinstate the individual to favorable status if it was brought to their attention.
That doesn’t mean the search engine hasn’t, and won’t, do evil things. It does mean that they take their mission to organize the world’s information seriously.
So I ask again, what would we do without Google? It was Google who taught us that back links count. It was Google that brought us the wholesale opportunity to advertise our businesses and choose what we are willing to pay for a lead.
It was Google that made us realize that search engine spam can be tackled at the source and while they aren’t perfect at defeating it they do a nice job.
I know that Google is constantly seeking improvement. Sometimes they fail and institute policies that backfire or that don’t work. But they are organizing the world’s information pretty well and I know that if I need to know something really important I can search Google and within minutes have that pressing question answered. Without Google, I’d be living at the library.