Re: How can I use a pipeline in Scrapy when using a spider through the parse command?

Rolando Espinoza La Fuente

Mar 28, 2013, 1:52:48 PM
to scrapy...@googlegroups.com
Hi, you can try my branch
https://github.com/darkrho/scrapy/tree/crawl-cmd-callback

It adds the capability to the `crawl` command to specify a URL and a callback.
The specific changes are here:
https://github.com/darkrho/scrapy/commit/92d443c7965263f0fa6cd54d9d8446e0c244e059

And you use it like this:

scrapy crawl stackoverflow http://stackoverflow.com/users/1194900/jeff70227 -c parse_profile

I found this really useful to test the scraping code in spiders that
perform multi-level crawling.
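
For context, here is a minimal sketch of what such a multi-level spider could look like against a recent Scrapy version (the spider name matches the command above, but the selectors and item fields are invented for illustration):

import scrapy


class StackOverflowSpider(scrapy.Spider):
    name = "stackoverflow"
    start_urls = ["http://stackoverflow.com/users"]

    def parse(self, response):
        # First crawl level: follow links from the listing page to profile pages.
        for href in response.css("div.user-details a::attr(href)").extract():
            yield scrapy.Request(response.urljoin(href), callback=self.parse_profile)

    def parse_profile(self, response):
        # Second crawl level: extract fields from a single profile page.
        # This is the callback you would pass with -c parse_profile.
        yield {
            "name": response.css("h1::text").extract_first(),
            "url": response.url,
        }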

Rolando

On Wed, Mar 27, 2013 at 10:49 PM, Gabriel Perez <gabri...@gmail.com> wrote:
>> scrapy parse --spider=$spider -c parse_thread -d 100 --nolog $url
>
>
> Is there a way to enable pipelines for items processed using the parse
> command?

Pablo Hoffman

Apr 12, 2013, 7:27:15 AM
to scrapy...@googlegroups.com
There is now, in the latest dev version.

Pull request 284 added a --pipelines argument to the parse command.
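
For example, building on the command quoted earlier in this thread, the invocation would presumably look something like this (untested sketch; check scrapy parse --help in your version for the exact option):

scrapy parse --spider=$spider --pipelines -c parse_thread -d 100 --nolog $url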