Approach the idea of search engine optimization (SEO) through the eyes of a site user by following these guidelines:
Provide substance. Create articles and text that a surfer will actually read. Webmasters who explain a topic thoroughly will find that keywords appear naturally in their writing. Create original content; avoid republishing RSS feeds or purchasing content that already appears on many other websites. Search engines that see the same text on multiple servers may penalize those sites' relevancy. Original text also gives repeat visitors something new to read and a reason to come back.
Update frequently. The more often a site is updated, whether weekly, daily or more often, the more attention it will get from search engine spiders. By watching server log files, webmasters can track each time Google's Googlebot hits a site and how many times it returns each day. Add new content the next day and watch the logs again for spider visits; it will soon be apparent that the site is being indexed more frequently as the amount of fresh content grows.
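For illustration only, here is roughly what a Googlebot visit looks like in an Apache-style combined access log (the IP address, date, path and response size below are invented, and the exact user agent string varies by Googlebot version; the user agent is what identifies the spider):

    66.249.66.1 - - [12/Oct/2004:09:14:07 -0700] "GET /articles/new-review.html HTTP/1.1" 200 8412 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"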
Label properly. Use descriptive and accurate text for the "title" tag and for other labels such as image "alt" attributes and link anchor text, so that the words describing a page match the words surfers will actually search for.
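As a minimal sketch of this kind of labeling (the page, file names and link below are invented), descriptive labels in a page's HTML might look like this:

    <head>
      <title>Casablanca (1942) DVD review and plot summary</title>
    </head>
    <body>
      <!-- the alt attribute and link text describe their targets in plain words -->
      <img src="casablanca-cover.jpg" alt="Casablanca DVD cover art">
      <a href="classic-films.html">More classic film reviews</a>
    </body>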
Prefer static pages. Pages ending in ".html" are best. ".asp" and ".php" pages can be indexed, but pages that tack on session IDs and other parameters after the "?" in the URL can give webmasters trouble. If a site features a link to a printer-friendly version of a page, search engines will see two copies of the same text. Use a "robots.txt" file to block access to the printer-friendly version, prevent indexed duplicates and avoid a potential penalty.
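A "robots.txt" sketch for that situation, assuming the printer-friendly copies live under a "/print/" directory (the path is hypothetical and should match wherever a site actually keeps them):

    # Keep all spiders out of the printer-friendly duplicates
    User-agent: *
    Disallow: /print/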
Get good inbound links. Sites can be penalized for receiving links from "bad neighborhoods," websites flagged by Google as search engine spam. Webmasters must research and evaluate every website they want linking to them. Follow this rule: quality in, quality out, and leave the garbage behind.
Avoid fancy HTML. Heavy use of JavaScript, frames, iFrames, DHTML and Flash can hamper a spider's ability to crawl through a website.
Get listed. Submit websites to the appropriate DMOZ and Yahoo categories. Because these directories are human-verified, a listing there adds relevancy to search engine results.
Avoid cloaking techniques. Cloaking occurs when a website presents one version of a page to viewers and another to search engine spiders in order to manipulate search results, which violates the first guideline above: provide substance. There is a legitimate alternative that builds on two earlier points, blocking duplicate pages with "robots.txt" and avoiding fancy HTML. Webmasters can keep their complicated, dynamic pages for visitors, create plain static pages for spiders to index, and use "robots.txt" to block the dynamic pages.
It can be as simple as using "robots.txt" to block all ".php" scripts. By blocking the dynamic pages, webmasters can be sure that spiders will index only the static pages. A "robots.txt" file affects only spiders; site visitors are never blocked from accessing the files it lists.
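A hedged sketch of that idea, with made-up script names, using the standard prefix matching that all spiders understand (some spiders, such as Googlebot, also accept nonstandard wildcard patterns like "Disallow: /*.php"):

    User-agent: *
    # one Disallow line per dynamic script or directory
    Disallow: /cart.php
    Disallow: /search.php
    Disallow: /cgi-bin/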
An example of this approach is in development at DVDsForABuck.com. Surfers can follow the "Studios and Actors" link at the bottom of the page to view DVD breakdowns, but the real intent is to give spiders access to the movie descriptions without tripping them up in dynamically created ".php" pages.
DVDsForABuck's pages were created as static HTML and provide plenty of text to be indexed. If one of them appears in a search engine result, the surfer who lands there will always find a link leading into the rest of the site.
This minor modification raised the site's Google PageRank by two points after Google's spiders indexed the new text links.
By following these guidelines, webmasters can organically create better search engine results and influence the relevancy of their websites.
Brandon "Fight the Patent" Shalton is the creator of T3Report.com and used the above techniques in his blog at FightThePatent.com during the heyday of Acacia.