In Scrapy, for example, suppose I have two URLs that contain different HTML. I want to write two individual spiders, one for each URL, and run both spiders at once. Is it possible in Scrapy to run multiple spiders at once?
After writing multiple spiders in Scrapy, how can we schedule them to run every 6 hours (perhaps like cron jobs)?
I have no idea how to do the above; can you suggest how to perform these things, with an example?
Thanks in advance.
The answer for 1 would be:
1. you write the spiders (in the development environment of your choice)
2. you test and debug the spiders with "scrapy crawl"
3. once the spiders are ready you deploy them to a Scrapyd server
4. you use the Scrapyd schedule.json API to schedule 2 spider runs, one
for each URL, using spider arguments to pass the url.
Pablo.
--
You received this message because you are subscribed to the Google Groups "scrapy-users" group.
To view this discussion on the web visit https://groups.google.com/d/msg/scrapy-users/-/N4Zb7iVUly0J.