In <npmDCwt59....@netcom.com>, n...@netcom.com wrote:
> Upon close examination of my log I can see there's some midnight marauder
> (of the programmatic kind) going around and looking for "robot.txt".
> Just what exactly is it looking for in that robot.txt file? I'd like to
> feed the little devils.

robots.txt is a de facto standard file that webwalkers/spiders/etc. look
for on web sites to determine what to index on that site (or whether to
index the site at all).
The file looks like:

#This is a file retrieved by webwalkers a.k.a. spiders that
#conform to a de facto standard.
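The comment above is only the opening of the file; the directives that follow it are what the spiders actually act on. A minimal sketch of a complete robots.txt, assuming the usual exclusion convention (the paths here are illustrative, not from the original post):

```
#This is a file retrieved by webwalkers a.k.a. spiders that
#conform to a de facto standard.

# Rules for every robot; a robot matches the most specific
# User-agent line it finds, with "*" as the catch-all.
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

# To bar all robots from the entire site instead, you would use:
#User-agent: *
#Disallow: /
```

Each `Disallow` line names a URL path prefix the robot should not retrieve; an empty `Disallow:` (or no matching rule) means everything is fair game.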
See <URL:http://web.nexor.co.uk/mak/doc/robots/norobots.html> for more
information on the robots exclusion standard.
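From the crawler's side, honoring the file is straightforward: fetch /robots.txt, parse the User-agent/Disallow rules, and check each URL against them before retrieving it. A minimal sketch using Python's standard-library urllib.robotparser (a modern convenience, not something the original post mentions; the example.com URLs and "MySpider" agent name are placeholders):

```python
# Sketch of a well-behaved spider checking robots.txt rules before fetching.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# A real crawler would call rp.set_url("http://example.com/robots.txt")
# followed by rp.read(); here we parse the rules directly so the
# example is self-contained and needs no network access.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("MySpider", "http://example.com/index.html"))      # -> True
print(rp.can_fetch("MySpider", "http://example.com/private/x.html"))  # -> False
```

A spider that skips this check still works, but it is exactly the kind of "midnight marauder" the original poster was seeing in his logs.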