Generally, websites that use a third-party service to render some data visualization (map, table, etc.) have to send the data somehow, and in most cases that data is accessible from the browser.
So, basically, all the data you want is right there in a nice JSON format, ready for consumption.
Scrapy provides the "shell" command, which is very convenient for tinkering with the website before writing the spider:
2013-09-27 00:44:14-0400 [scrapy] INFO: Scrapy 0.16.5 started (bot: scrapybot)
...
In [1]: from scrapy.http import FormRequest
In [3]: fetch(req)
...
In [4]: import json
In [5]: data = json.loads(response.body)
In [6]: len(data['stores']['listing'])
Out[6]: 127
In [7]: data['stores']['listing'][0]
Out[7]:
{u'address': u'678A Woodlands Avenue 6<br/>#01-05<br/>Singapore 731678',
u'city': u'Singapore',
u'id': 78,
u'lat': u'1.440409',
u'lon': u'103.801489',
u'name': u"McDonald's Admiralty",
u'op_hours': u'24 hours<br>\r\nDessert Kiosk: 0900-0100',
u'phone': u'68940513',
u'region': u'north',
u'type': [u'24hrs', u'dessert_kiosk'],
u'zip': u'731678'}
In short: in your spider you have to return the FormRequest(...) above, then in the callback load the JSON object from the response's body, and finally, for each store in the listing, create an item with those values.
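
To make that concrete, here is a rough sketch of such a spider. The form URL and formdata below are placeholders (use the same request you built as req in the shell session above), and the item fields are just an illustrative subset of the keys shown in the listing:

import json

from scrapy.http import FormRequest
from scrapy.item import Item, Field
from scrapy.spider import BaseSpider  # spelling used by the Scrapy version in the log above


class StoreItem(Item):
    name = Field()
    address = Field()
    phone = Field()
    lat = Field()
    lon = Field()


class StoresSpider(BaseSpider):
    name = 'stores'

    def start_requests(self):
        # Placeholder URL/params: replace them with the request the site
        # actually makes (the one you fetched in the shell).
        yield FormRequest('http://example.com/stores/search',
                          formdata={'region': 'all'},
                          callback=self.parse_stores)

    def parse_stores(self, response):
        data = json.loads(response.body)
        for store in data['stores']['listing']:
            # One item per store, populated straight from the JSON values.
            yield StoreItem(
                name=store['name'],
                address=store['address'],
                phone=store['phone'],
                lat=store['lat'],
                lon=store['lon'],
            )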
Hope that helps.
Regards,
Rolando