Some experts estimate that search drives only about 30 percent of a website’s traffic.
While that may sound modest, that 30 percent is often made up of visitors actively looking for the products, services, or information specific to your company, so losing them can have a significant impact on how well an organization performs.
In part one of this series we looked at some of the things web developers need to consider when it comes to search engine optimization. Now it’s time to step up to the plate, roll up our sleeves, and see what we can actually do to help our site compete in the search engine results pages.
While we know that our objective is to rank highly, we first need to understand what prevents a site from achieving that goal from a developer’s standpoint.
As a developer, the number one question to ask is: can a search engine spider crawl my website? Quite honestly, if the pages of the site can’t be crawled, nothing else really matters.
So what prevents a spider program from finding the right information on our websites?
Frames
In the past, spiders were not able to read frames, but now almost all the major spiders can read them. If you do need to work around a particular spider, the quick solution is to place optimized content inside your “noframes” tag. It is also advisable to include a base href tag in your header so that search engines can resolve your relative links correctly.
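As a minimal sketch of this idea (the domain and all page names here are hypothetical), a frameset page might carry both a base href tag and crawlable noframes content like so:

```html
<html>
<head>
  <title>Acme Widgets - Hand-Made Widgets</title>
  <!-- Gives spiders a canonical base for resolving relative links -->
  <base href="https://www.example.com/">
</head>
<frameset cols="20%,80%">
  <frame src="nav.html" name="nav">
  <frame src="main.html" name="main">
  <!-- Crawlable, keyword-rich content for spiders and
       frame-incapable browsers -->
  <noframes>
    <body>
      <h1>Acme Widgets</h1>
      <p>We build hand-made widgets. Browse our
         <a href="catalog.html">widget catalog</a> or read
         <a href="about.html">about the company</a>.</p>
    </body>
  </noframes>
</frameset>
</html>
```

The noframes body should summarize the framed content and link to the real pages, not just repeat keywords.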
Password Protected Pages
These are pages you probably don’t want indexed anyway. Just be aware that, like a human visitor without the password, a spider cannot enter any area that is password protected.
If you have content behind these pages that you want people to find via the search engines, consider publishing a teaser containing some of the content that can be indexed, and protect the rest.
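A teaser page along these lines (the titles and URLs below are invented for illustration) keeps a crawlable public summary while the full content stays behind the login:

```html
<html>
<head><title>Annual Widget Market Report - Summary</title></head>
<body>
  <h1>Annual Widget Market Report</h1>
  <!-- This summary is public and indexable -->
  <p>Widget sales grew sharply this year, led by demand in the
     hand-made segment. This free summary covers the key findings
     of the full report.</p>
  <!-- The full report sits behind the password-protected area -->
  <p><a href="/members/report.html">Log in to read the full report</a></p>
</body>
</html>
```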
Flash
Flash looks cool and adds interactivity to your site, but most of the time it gives the search engine spiders trouble. Google has recently begun reading, indexing, and ranking Flash pages based on the text content used in the Flash file, but relying on that is far from optimal optimization.
One solution is to use a keyword-rich entrance page; another is to create a two-frame frameset in which one frame is only one pixel high and put your content in the noframes area; or you can alternate the use of Flash and static HTML.
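The one-pixel frameset trick mentioned above could be sketched as follows (all file names here are hypothetical): the visible frame holds the Flash page, while the noframes area carries the indexable text.

```html
<html>
<head><title>Acme Widgets - Welcome</title></head>
<!-- Top frame is 1 pixel high and effectively invisible -->
<frameset rows="1,*" frameborder="0">
  <frame src="blank.html" scrolling="no" noresize>
  <!-- Visitors see only the Flash page -->
  <frame src="flash-intro.html" name="main">
  <!-- Spiders read this keyword-rich content instead -->
  <noframes>
    <body>
      <h1>Acme Widgets</h1>
      <p>Keyword-rich description of the site goes here, with
         <a href="catalog.html">links to the real HTML pages</a>
         so spiders can crawl the rest of the site.</p>
    </body>
  </noframes>
</frameset>
</html>
```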
Image Maps
These are tricky because they can be read by some spiders but not by others. If you plan to use an image map, make sure there are other links on the page (perhaps at the bottom) that point to your other pages, or better still, to a site map that links to all your pages with good anchor text.
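For example (page names invented for illustration), an image map backed up by plain text links might look like this:

```html
<!-- Graphical navigation via an image map -->
<img src="navbar.gif" usemap="#mainnav" alt="Site navigation">
<map name="mainnav">
  <area shape="rect" coords="0,0,100,40"
        href="products.html" alt="Products">
  <area shape="rect" coords="100,0,200,40"
        href="contact.html" alt="Contact Us">
</map>

<!-- Plain text links at the bottom of the page, so every spider
     can follow them regardless of image map support -->
<p>
  <a href="products.html">Products</a> |
  <a href="contact.html">Contact Us</a> |
  <a href="sitemap.html">Site Map</a>
</p>
```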
PDF Files
These are a popular way to share content, but they present a major stumbling block to most spiders. Some engines (notably Google) are, however, beginning to index these kinds of documents.
Dynamic Pages
Some search engine spiders have problems with dynamic pages whose URLs contain variables. This is most often seen with dynamic pages generated by CGI, ASP, or ColdFusion. Google, for instance, will not index pages whose URL contains id= followed by more than ten characters, or pages with too many variables in the URL. If you are having problems with dynamically generated pages, consider using the Apache server’s rewrite module (mod_rewrite) to rewrite those dynamic URLs into static-looking URLs, or a similar add-on if you are hosted on a Windows server. There are also PHP scripts that can be implemented to present the address as a readable page.
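As a sketch of the mod_rewrite approach, assuming Apache with mod_rewrite enabled and a hypothetical product.asp script, a few lines in an .htaccess file can map a static-looking URL onto the real dynamic one:

```apache
# Hypothetical .htaccess rules: /products/42.html is what visitors
# and spiders see; the rewrite maps it internally to the dynamic URL.
RewriteEngine On
RewriteRule ^products/([0-9]+)\.html$ /product.asp?id=$1 [L]
```

With rules like this, your internal links should all point at the static-looking form (/products/42.html), so the variable-laden URL never appears to the spider at all.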