I am unable to crawl one of my sites. I have successfully set up, crawled, and generated sitemaps for several sites, but using the same settings, this one site will not crawl. In the aborted-URL list I get the error:
HTTP-Error 999 Page too large, check settings
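For reference, a quick way to see how large a page actually is (and whether a size limit in the crawler settings might be the culprit) is to download it and count the bytes. This is just a sketch; the 512 KB limit below is an assumed example, not a documented GSiteCrawler default:

```python
import urllib.request

# Assumed example limit (adjust to whatever your crawler settings use).
ASSUMED_LIMIT = 512 * 1024  # 512 KB

def exceeds_limit(size_bytes: int, limit_bytes: int = ASSUMED_LIMIT) -> bool:
    """Return True if a page of this size would trip the limit."""
    return size_bytes > limit_bytes

def page_size(url: str) -> int:
    """Download the page and return its size in bytes."""
    with urllib.request.urlopen(url) as resp:
        return len(resp.read())

# Example: print the size of a page and whether it exceeds the limit.
if __name__ == "__main__":
    url = "http://example.com/"  # hypothetical URL; substitute the failing page
    size = page_size(url)
    print(size, exceeds_limit(size))
```

If the failing page turns out to be far larger than the others, raising the maximum-page-size setting (or trimming the page) would be the thing to try.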
Does anyone have ideas on how I might resolve this? Any assistance is much appreciated.
--
You received this message because you are subscribed to the Google Groups "SOFTplus GSiteCrawler" group.