Hey Jim,
It still seems unintuitive that you need to go through HTTP requests
when you have full access to everything. Have you looked at Drupal's
static generator?
However, if making an HTTP request is your only (simple) way of
generating the pages you want in the failover, Scrapy might indeed
be an option. If you know (i.e., can generate a list of) all your
URLs, you can simply put them in a Spider's `start_urls` or
`start_requests()`; I would prefer that over the requests library
because you get Scrapy's throttling, error handling, etc. (see the
sketch below). If the URLs are unknown, you can use CrawlSpider and
its crawling rules.
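
Here's a minimal sketch for the known-URL case (the domain, the
`failover/` output directory, and the file-naming scheme are my
assumptions; adapt them to your setup):

```python
import os
from urllib.parse import urlparse

import scrapy


class FailoverSpider(scrapy.Spider):
    name = "failover"

    # Hypothetical URL list; generate it however suits you,
    # e.g. from the Drupal database or a sitemap.
    start_urls = [
        "https://www.example.com/",
        "https://www.example.com/about",
    ]

    def parse(self, response):
        # Map the URL path onto a local file path; directory-style
        # URLs get an index.html so a plain web server can serve them.
        path = urlparse(response.url).path.lstrip("/")
        if not path or not os.path.splitext(path)[1]:
            path = os.path.join(path, "index.html")
        target = os.path.join("failover", path)
        os.makedirs(os.path.dirname(target), exist_ok=True)
        with open(target, "wb") as f:
            f.write(response.body)
```

Run it with `scrapy runspider failover_spider.py`; retries come with
Scrapy's defaults, and throttling is a settings tweak away (e.g.
`AUTOTHROTTLE_ENABLED`, `DOWNLOAD_DELAY`). For the unknown-URL case
you would subclass CrawlSpider instead, move the body of `parse()`
into a differently named callback (CrawlSpider reserves `parse()` for
its rule handling), and add something like
`Rule(LinkExtractor(allow_domains=["example.com"]), callback="save_page", follow=True)`.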
Cheers,
-Jakob