Hey all, I've been thinking about writing my own frontend to Scrapyd for a while, but I have to take it one step at a time. Pushing Scrapy usage at work was already a great victory.
I don't think any extra work on the server side is necessary. Scrapyd's JSON API should be enough. All we need is to serve static HTML, JS, and CSS files and build a frontend using AJAX calls only.
It could be useful to be able to upload spiders and to schedule jobs. The JSON API already supports this; it's just a matter of building a JS GUI around it.
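To illustrate how thin that layer can be, here is a sketch of a client for Scrapyd's schedule.json endpoint (the frontend would do the same thing with an AJAX POST). The host/port and the project/spider names are placeholders:

```python
import json
from urllib import parse, request

SCRAPYD = "http://localhost:6800"  # Scrapyd's default address

def build_schedule_request(project, spider, **settings):
    """Build the URL and form-encoded body for Scrapyd's schedule.json."""
    body = parse.urlencode(dict(project=project, spider=spider, **settings))
    return SCRAPYD + "/schedule.json", body.encode()

def schedule(project, spider, **settings):
    """POST to schedule.json and return the parsed JSON reply."""
    url, body = build_schedule_request(project, spider, **settings)
    with request.urlopen(url, data=body) as resp:
        return json.load(resp)  # e.g. {"status": "ok", "jobid": "..."}
```

The other endpoints (listprojects.json, listjobs.json, addversion.json for uploads, etc.) follow the same pattern, so a JS GUI is mostly forms and tables bound to these calls.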
Put together Twitter Bootstrap and some JS framework with data binding to minimize boilerplate work. There are quite a few options these days: Angular, Backbone, Spine, Knockout, etc.
On a related note, the ability to accept AJAX calls from other hosts would be desirable. This can be achieved with CORS headers; it's really just a matter of adding a header to Scrapyd's HTTP responses.
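A minimal sketch of what that one header looks like, using the stdlib http.server purely for illustration (in Scrapyd itself the equivalent would be a setHeader call in its twisted.web resources):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request

class CORSHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b'{"status": "ok"}'
        self.send_response(200)
        # The one line that enables cross-origin AJAX calls:
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Spin up the server on an ephemeral port and fetch the header back.
server = HTTPServer(("127.0.0.1", 0), CORSHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
with request.urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    cors = resp.headers["Access-Control-Allow-Origin"]
server.shutdown()
```

Using "*" allows any origin; a deployment could restrict it to the host serving the frontend instead.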
More info on CORS headers:
I have never used twisted.web, but serving static content appears to be trivial: