dim hua
hua = lcase(Request.ServerVariables("HTTP_USER_AGENT"))
if instr(1,hua,"xenu")>0 then Response.Redirect (BASESITE & "/crawler.asp")
if instr(1,hua,"teleport")>0 then Response.Redirect (BASESITE &
"/crawler.asp")
if instr(1,hua,"msiecrawler")>0 then Response.Redirect (BASESITE &
"/crawler.asp")
if instr(1,hua,"webcopier")>0 then Response.Redirect (BASESITE &
"/crawler.asp")
if instr(1,hua,"openfind")>0 then Response.Redirect (BASESITE &
"/crawler.asp")
if instr(1,hua,"webzip")>0 then Response.Redirect (BASESITE &
"/crawler.asp")
if instr(1,hua,"bordermanager")>0 then Response.Redirect (BASESITE &
"/crawler.asp")
Has anyone found a better solution?
Matt K
Create a robots.txt file in your root directory with the following content:
User-Agent: *
Disallow: /
That should do the trick.
--
George Hester
"Jesper Nielsen" <j...@nielsenit.dk> wrote in message
news:k0_u7.1530$uQ.2...@news010.worldonline.dk...
No, simply place the file in the root of your website.
The first file serious spiders look for is a robots.txt file, and if a spider
finds that it is not allowed to spider the files, it will leave again without
spidering/indexing anything.
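For reference, a robots.txt can also target an individual user-agent rather than
blocking everyone, along these lines (WebZIP is just one of the agents named in
the code above; rogue spiders may ignore the file entirely):

User-Agent: WebZIP
Disallow: /

User-Agent: *
Disallow: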
/jesper/
We get several rogue spiders that can use over 40% of a server's capacity, often
run by people on ADSL lines.
Matt K
"Tom Pepper" <tomp...@mvps.org> wrote in message
news:Ooaoy2NTBHA.928@tkmsftngp03...