Article by R. L. Passman
Search engine optimization is a tool for improving both the quality and the volume of traffic to a website based on natural, or organic, search engine queries. Natural or organic searches are those that come from entering a search term into a search engine such as Google or Yahoo. The results returned will include both paid and algorithmic (unpaid) entries. Search engine optimization, or SEO, seeks to move the natural or organic results to the front of the line, so that a well-optimized site will appear within the first two to three pages of search results.
With a reasonably sound understanding of how search engine algorithms work and of how people actually search, in terms of keyword patterns, you can dramatically improve your marketing efforts through website exposure. SEO seeks to optimize site coding, structure, keyword choice, title and description content, and page content. The trick is to optimize both the code and the human interface in a way that is transparent to the human visitor yet clear to the spider robot that crawls the site looking for information. Because humans and robots expect different things, optimizing a website is always a balancing act between the appealing and the technical. Well-developed SEO makes the site attractive to the humans who visit it and easily indexed by the spiders that blindly, mechanically crawl the web.
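Title and description content of the kind mentioned above normally lives in the document head. A minimal sketch, with a hypothetical business and wording:

```html
<head>
  <!-- Title: typically shown as the clickable headline in search results -->
  <title>Handmade Oak Furniture | Example Workshop</title>
  <!-- Meta description: often used as the snippet below the headline -->
  <meta name="description"
        content="Custom handmade oak tables and chairs, built to order.">
</head>
```

Keeping the title short and the description a plain sentence gives both the spider and the human searcher something useful to read.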
I like to think of the search engine spider as just another visitor to a website, one with some very specific criteria it must satisfy in order to do its job. Because I treat the spider as another visitor, I generally design two websites. The first is a site designed in code, developed specifically to accommodate the spider that will crawl the site with some frequency. I want the spider to index the site in an orderly manner, recognize its structure, find all the pages I want indexed, and pick up the relevant keywords I am targeting. I realize the spider sees only code. It cannot appreciate fancy illustrations, Flash animation, pretty pictures, or award-winning layouts. No, the spider sees code, and only a limited subset of the whole code at that. So writing code for the search engine is one important website to develop.
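One common way to help the spider find every page you want indexed is a robots.txt file at the site root, which can also point to an XML sitemap listing those pages. A minimal sketch, assuming a hypothetical example.com domain and paths:

```
# robots.txt — read by well-behaved crawlers before they index the site
User-agent: *
Disallow: /admin/
Sitemap: http://example.com/sitemap.xml
```

The `Disallow` line keeps private areas out of the index, while the `Sitemap` line hands the spider an orderly list of the pages you do want crawled.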
The second website is the one people see. It is the one with navigation links, pictures, illustrations, contact forms, pleasing colors, and the like. Of course, this site is built on the very same code I wrote for the spider, only translated into something appealing to the human viewer. Like many designers, I keep content code separate from layout code. The spider reads the content code, while the layout code shapes what the human viewer sees. So I write content in HTML and layout in CSS. In this way the project stays separated yet integrated, and each visitor to the site, human or spider, gets what it needs. Fighting with the search engine is a losing battle; understanding the search engine as a visitor yields positive results for both parties.
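The separation described above can be sketched in two files; the file name, class name, and content here are hypothetical:

```html
<!-- page.html: content only — this markup is what the spider reads -->
<link rel="stylesheet" href="style.css">
<h1 class="headline">Handmade Oak Furniture</h1>
<p>Custom tables and chairs, built to order in our workshop.</p>
```

```css
/* style.css: layout only — this shapes what the human visitor sees */
.headline {
  font-family: Georgia, serif;
  color: #503018;
}
```

The HTML stays lean and keyword-rich for the spider, while all of the visual presentation lives in the stylesheet.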
The well-developed website takes into consideration what the search engine requires and writes its code accordingly. The same site also uses style sheets to create an appealing interface for the human user. This is a classic win-win, and it is almost guaranteed to improve both the quantity and the quality of website visits for your company or organization.
Roger is a principal at RNS Design Group, an internet design firm specializing in small businesses and organizations. Through RNS Domains he offers Linux or Windows website hosting services, domain name registration and more.
Article Source: http://EzineArticles.com/?expert=R._L._Passman