How would scrapy handle recursive call?

Mehdi Nazari

May 20, 2015, 2:52:12 PM
to scrapy...@googlegroups.com
Hi All,

I've been working on a Scrapy-based project for a while now, and I've reached a point where I need to make a recursive call originating from the same parse function.
Here is the code I'm trying to get to work:

def parse_search_result(self, response):
    try:
        # Extract the href of the "next page" link from the pagination block.
        next_page = response.xpath("//div[@class='align_center']//node()[following::span and not(@class='noborder')]/@href").extract()[0]
        # `delete` is defined elsewhere in my spider as the set of characters
        # to strip out (Python 2 str.translate).
        next_page = str(next_page).translate(None, delete)
        next_page_url = '{0}{1}'.format(self.base_url[0], next_page)
        # Recursive step: request the next results page with this same callback.
        yield FormRequest(next_page_url, method="GET", callback=self.parse_search_result)
    except IndexError:
        # No "next page" link was found, so this is the last results page.
        pass

    # Always hand the current page off to the applications parser.
    yield FormRequest(response.url, method="GET", callback=self.parse_applications)
Am I approaching this piece of logic the right way?

Daniel Fockler

May 21, 2015, 4:33:20 PM
to scrapy...@googlegroups.com
With recursion you need a base case where your function stops calling itself. So at some condition you will want to yield an Item instead of another FormRequest.
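For example, here is a minimal sketch of that pattern; the XPath expressions, item fields, and URLs are placeholders, not taken from your spider:

import scrapy
from scrapy.http import FormRequest


class SearchResultItem(scrapy.Item):
    url = scrapy.Field()


class SearchSpider(scrapy.Spider):
    name = 'search'
    base_url = ['http://example.com']           # placeholder
    start_urls = ['http://example.com/search']  # placeholder

    def parse(self, response):
        # Handle the current results page: yield one Item per result link
        # (the XPaths stand in for whatever your pages actually use).
        for href in response.xpath("//div[@class='result']//a/@href").extract():
            item = SearchResultItem()
            item['url'] = href
            yield item

        # Recursive step: follow the "next page" link with this same callback.
        next_page = response.xpath("//a[@class='next']/@href").extract()
        if next_page:
            next_page_url = '{0}{1}'.format(self.base_url[0], next_page[0].strip())
            yield FormRequest(next_page_url, method="GET", callback=self.parse)
        # Base case: no "next page" link is found, so no further request is
        # scheduled and the recursion ends.

The absence of the "next page" link is the base case here: nothing new gets yielded for the pagination, so the chain of requests simply stops.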