[ccache] Jonathan Parr presents www.libeldefense.com
noss1233 at gmail.com
Thu Nov 1 08:54:39 GMT 2007
Visitors find information in a dynamic site by using a search query. That
query can either be typed into a search form by the visitor or already be
coded into a link on the home page - making the link a pre-defined search of
the site's catalog. In the latter case, the portion of the link containing
the search parameters is called a 'query string.'
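As a sketch of how that works, the snippet below pulls the query string out of a hypothetical pre-defined search link (the domain and parameter names are illustrative only) and decodes it into search parameters:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical pre-defined search link; domain and parameters
# are illustrative, not from any real site.
link = "http://example.com/catalog.cgi?category=widgets&sort=price"

# Everything after the '?' is the query string.
query_string = urlparse(link).query
print(query_string)            # category=widgets&sort=price

# The server-side script decodes the query string into parameters.
params = parse_qs(query_string)
print(params["category"][0])   # widgets
```

A visitor clicking such a link never types anything; the search is already encoded in the URL.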
But a search engine spider doesn't know how to use your search function -
or what questions to ask. Dynamic scripts often need certain information
before they can return page content: cookie data, a session id, or a query
string are common requirements. Spiders usually stop indexing a dynamic site
because they can't supply that information.
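A minimal sketch of that failure, assuming a hypothetical dynamic page handler that demands a session id (here named `sid`) before it will serve content:

```python
from urllib.parse import parse_qs

def handle_request(query_string: str) -> str:
    """Hypothetical dynamic page handler: it requires a session id
    in the query string before it will return any page content."""
    params = parse_qs(query_string)
    if "sid" not in params:
        # A spider arriving without a session id gets nothing to index.
        return "Error: missing session id"
    return "<html>catalog page for session %s</html>" % params["sid"][0]

print(handle_request(""))            # Error: missing session id
print(handle_request("sid=abc123"))  # the actual catalog page
```

A visitor's browser carries the session id along automatically; the spider arrives empty-handed and is turned away.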
If the spider does wander deeper into your site, it could inadvertently get
caught in a "spider trap": a badly written CGI script that requests
information the spider can't supply. The spider and your server then enter a
never-ending loop in which every request for a page is met with a request
for information.
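The loop can be simulated in a few lines. This is only a sketch with made-up URLs: a badly written script answers every request for the page with a redirect demanding a session id, and the "set session" URL redirects straight back to the page:

```python
def spider_trap(url: str, max_requests: int = 5) -> int:
    """Simulate a spider caught in a redirect loop between two
    hypothetical URLs. Returns the number of requests made before
    an artificial cap stops the simulation."""
    requests = 0
    while requests < max_requests:
        requests += 1
        if url == "/page":
            url = "/set_session"   # script demands information first
        elif url == "/set_session":
            url = "/page"          # which sends the spider back again
    return requests

print(spider_trap("/page"))  # 5 - stopped only by our artificial cap
```

Without the cap, the two URLs would bounce the spider back and forth forever, which is exactly the load pattern that hammers the server.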
Getting a spider trapped inside your server is not just bad for the spider.
The repeated requests for pages can crash the server. If you share server
space with other Web sites and have a problem with site downtime, ask your
Web host to check for CGI script problems on other sites.