Hi,
when running my scraper on Scrapinghub, all requests are registered as failed and the following error is reported:
```
Traceback (most recent call last):
File "/app/python/lib/python2.7/site-packages/twisted/internet/defer.py", line 149, in maybeDeferred
result = f(*args, **kw)
File "/usr/local/lib/python2.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/usr/local/lib/python2.7/site-packages/sh_scrapy/extension.py", line 39, in item_scraped
item = self.exporter.export_item(item)
File "/app/python/lib/python2.7/site-packages/scrapy/exporters.py", line 304, in export_item
result = dict(self._get_serialized_fields(item))
File "/app/python/lib/python2.7/site-packages/scrapy/exporters.py", line 65, in _get_serialized_fields
field_iter = six.iterkeys(item)
File "/usr/local/lib/python2.7/site-packages/six.py", line 593, in iterkeys
return d.iterkeys(**kw)
AttributeError: 'NoneType' object has no attribute 'iterkeys'
```
In the Scrapinghub dashboard all items appear as failed (0 processed), yet the scraper itself seems to work fine and serializes the correct data into the database.
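Judging by the traceback, the exporter receives `None` instead of an item. One hypothetical cause would be an item pipeline whose `process_item` writes to the database but forgets to `return item` (the class and method below are a sketch, not my actual code):

```python
# Hypothetical pipeline sketch: process_item stores the item but does not
# return it, so Scrapy passes None down the chain. The exporter then calls
# six.iterkeys(None), producing exactly this AttributeError.
class SaveToDbPipeline(object):
    def process_item(self, item, spider):
        self._save(item)
        # BUG: missing "return item" -- the next stage receives None

    def _save(self, item):
        # placeholder for the real database write
        pass
```

With this bug the data still reaches the database (the write happens before the dropped return), which would match the symptoms: correct rows in the DB but 0 items processed on the dashboard.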
Can anybody explain this?