Hi all,
I have been using OWASP ZAP to test my application for a number of years, more recently moving to the Automation Framework, which has been great.
My application is a fairly old MVC application, but it is still under active development, and the technology used may be at the heart of the issue here. The main problem is that the spider/AJAX spider can find endpoints rather well but can't read the forms or otherwise work out what to submit, so the combination of the spiders and the active scan is fairly limited: every attempt to attack a form is knocked back by validation.
The solution has been to use an existing session: navigate through the site and submit forms manually. All the successful form POSTs are then saved in the Context, and ZAP is very good at taking those successful POSTs and modifying them as part of the active scan.
With the Automation Framework this has worked well; we just run it with:
./zap.sh -cmd -autorun "myAttackPlan.yaml" -session "my.session"
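For context, the plan is roughly along these lines. This is a trimmed-down sketch; the context name, URL and report settings are just placeholders:

env:
  contexts:
    - name: "myApp"                    # placeholder context name
      urls:
        - "https://myapp.example.com"  # placeholder target URL
jobs:
  - type: spider
    parameters:
      context: "myApp"
  - type: ajaxSpider
    parameters:
      context: "myApp"
  - type: activeScan
    parameters:
      context: "myApp"
  - type: report
    parameters:
      template: "traditional-html"
      reportDir: "/tmp/zap-reports"    # placeholder output directory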
The problem is that the session we use becomes stale. New forms aren't included, and anything the spider found last time isn't carried over to the next run. The best solution we have at the moment is to open the saved session, manually navigate a bit more when there is new content, and then remember to save it and make a copy (if the session isn't copied before the spider/active scan runs against it, it is effectively ruined: it becomes huge, since it seems to store the response to every attack).
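To give an idea of the workaround, the copy-then-run step is roughly this (the paths are just examples; a ZAP session is several files on disk, so the glob copies the whole set):

# Keep a "golden" session that only contains the manual navigation and
# form POSTs, and run each scan against a throwaway copy so the golden
# session isn't bloated with attack traffic.
cp /zap/sessions/golden/my.session* /zap/sessions/work/
./zap.sh -cmd -autorun "myAttackPlan.yaml" -session "/zap/sessions/work/my.session"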
Is there a better way to re-use the form data from one test run to the next? Ideally we would like to export just the form data and URLs for use next time, which we could then build on, without keeping the entire session, which can be 30+ GB after an active scan.