I would like to act on all items at once, in other words collect all items and write them to a file in one go, adding a header that wraps all the items. A good place for this seems to be the item pipelines, but there items are only handled one by one.
I found this solution: "How to access all scraped items in Scrapy item pipeline?" https://stackoverflow.com/questions/12768247/how-to-access-all-scraped-items-in-scrapy-item-pipeline
But that approach seems more complex than necessary. Is there a smarter, shorter, easier or more elegant way? Thanks!
class MyPipeline(object):
    def __init__(self):
        self.items = []

    def process_item(self, item, spider):
        self.items.append(item)
        return item

    def close_spider(self, spider):
        pass  # Save self.items here, writing everything to a file in one go
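For what it's worth, here is a minimal sketch of how the collect-then-write idea could look with `close_spider` filled in. The class name, the output filename `output.json`, and the shape of the header are my own assumptions, not anything Scrapy prescribes; the pipeline itself is plain Python, since the `spider` argument is never used:

```python
import json

class CollectAllPipeline:
    """Collects every scraped item and writes them once, wrapped in a header."""

    def open_spider(self, spider):
        self.items = []

    def process_item(self, item, spider):
        # Store a plain-dict copy so later mutation of the item cannot
        # change what gets dumped at the end.
        self.items.append(dict(item))
        return item

    def close_spider(self, spider):
        # Hypothetical header; wrap the collected items and write once.
        wrapped = {"header": {"count": len(self.items)}, "items": self.items}
        with open("output.json", "w", encoding="utf-8") as f:
            json.dump(wrapped, f, ensure_ascii=False, indent=2)
```

Driven manually for illustration (passing `None` as the spider), `open_spider` → two `process_item` calls → `close_spider` produces an `output.json` whose header reports two items.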