In the simplest terms, search engines collect data about a website by sending an electronic spider to visit the site and copy its content, which is then stored in the search engine’s database.
Generally known as ‘bots’, these spiders are designed to follow links from one document to the next. As they copy and assimilate content from one document, they record its links and send other bots to copy the content of the linked documents. This process continues ad infinitum. By sending out spiders and collecting information 24/7, the major search engines have built databases measured in the tens of billions of documents. Both Yahoo and Google claim to spider as much data every day as is contained in the US Library of Congress (approx. 150 million items).
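The copy-record-follow loop described above is essentially a breadth-first traversal of the link graph. The sketch below illustrates the idea over an in-memory “web” (a dict of URL to HTML) rather than live HTTP fetches; the `crawl` and `LinkExtractor` names are illustrative, not part of any real search engine’s implementation.

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Records href targets from <a> tags, as a spider records links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(pages, start):
    """Breadth-first crawl over an in-memory web (dict: url -> html).
    A real spider would fetch each URL over HTTP instead."""
    index = {}              # the spider's stored copy of each page
    frontier = deque([start])
    seen = {start}
    while frontier:
        url = frontier.popleft()
        html = pages.get(url)
        if html is None:
            continue
        index[url] = html   # copy the content into the "database"
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:   # follow recorded links
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return index

web = {
    "/":  '<p>Home</p><a href="/a">A</a><a href="/b">B</a>',
    "/a": '<p>Page A</p><a href="/b">B</a>',
    "/b": '<p>Page B</p>',
}
print(sorted(crawl(web, "/")))  # → ['/', '/a', '/b']
```

The `seen` set is what keeps the process from looping forever on pages that link back to each other; real crawlers add politeness delays and revisit schedules on top of this basic traversal.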
Good content is the most important aspect of search engine optimization.
The most basic rule of the trade is that search engine spiders can be relied upon to read basic body text 100% of the time. By providing a spider with plain text content, SEOs offer the engines information in the easiest format for them to read. While some search engines can strip text and link content from Flash files, nothing beats basic body text when it comes to providing information to the spiders. Good SEOs can almost always find a way to work basic body text into a site without compromising the designer’s intended look, feel, and functionality.