Run multiple spiders via scrapyd

gulan

Sep 26, 2013, 6:35:04 AM
to scrapy...@googlegroups.com
Hi, 

First of all I really love scrapy, it's powerful and easy to understand even if you, like me, have very little experience with Python!  

But I have this problem after upgrading to scrapy 0.18

For scrapy 0.14.4 I used the following custom command to run multiple (40+) spiders deployed via scrapyd: 


from scrapy.command import ScrapyCommand
import urllib
import urllib2
from scrapy import log

# scrapyd's schedule endpoint (default host/port)
url = 'http://localhost:6800/schedule.json'

class AllCrawlCommand(ScrapyCommand):

    requires_project = True
    default_settings = {'LOG_ENABLED': False}

    def short_desc(self):
        return "Schedule a run for all available spiders"

    def run(self, args, opts):
        for s in self.crawler.spiders.list():
            values = {'project': 'myproject', 'spider': s}
            data = urllib.urlencode(values)
            req = urllib2.Request(url, data)  # POST schedules the spider
            response = urllib2.urlopen(req)
            log.msg(response.read())


I then just typed scrapy allcrawl, and all spiders launched one after the other. Very simple, very neat.


I recently installed Scrapy 0.18 on Debian Squeeze, and when I try to run this command I get:

/usr/local/lib/python2.6/dist-packages/Scrapy-0.18.2-py2.6.egg/scrapy/command.py:34: ScrapyDeprecationWarning: Command's default `crawler` is deprecated and will be removed. Use `create_crawler` method to instatiate crawlers.
  ScrapyDeprecationWarning)

I haven't been able to figure out what this means, really, and basically I want to know:

a) Is there an easy fix, i.e. can I keep this custom command with some alteration (use the 'create_crawler' method in the command, perhaps?) to run multiple spiders?
b) Or should I use a whole different approach to achieve this in Scrapy 0.18?

Thanks!

/Gustav








Duy Nguyen

Nov 22, 2013, 4:14:38 PM
to scrapy...@googlegroups.com
Use:
    crawler = self.crawler_process.create_crawler()
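In other words, the deprecation warning comes from reading the implicit self.crawler; in Scrapy 0.18 the spider list should come from a crawler created via self.crawler_process.create_crawler(), while the scrapyd scheduling loop itself stays the same. As a scrapy-free sanity check, here is a minimal sketch of the POST body that loop sends to scrapyd's schedule.json for each spider (the project name and spider names below are placeholders):

```python
# Minimal sketch of the form-encoded body the allcrawl loop POSTs to
# scrapyd's schedule.json for each spider. Pure stdlib, so it runs
# without scrapy installed; 'myproject' and the spider names are
# placeholders standing in for crawler.spiders.list().
try:
    from urllib.parse import urlencode  # Python 3
except ImportError:
    from urllib import urlencode        # Python 2

def schedule_payload(project, spider):
    """Build the POST body scrapyd expects when scheduling one spider."""
    return urlencode({'project': project, 'spider': spider})

for s in ['spider_a', 'spider_b']:
    print(schedule_payload('myproject', s))
```

Inside the command, the only structural change is obtaining the spider list from the new crawler, i.e. crawler = self.crawler_process.create_crawler() followed by for s in crawler.spiders.list().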
