Hi.
The current spider does support cookies through the option "Edit" > "Enable Session Tracking (Cookie)", although that feature has known issues (such as Issue 15 [1]).
You can try to spider and scan while logged in (though the issues mentioned above may prevent them from working properly) with the following steps; a scripted equivalent is sketched after the list:
- Open ZAP and select the option "Edit" > "Enable Session Tracking (Cookie)";
- Disable cookies in your browser, as ZAP will handle them, and configure the browser to use ZAP as its proxy;
- Log in;
- (Step to try to prevent the issues) In the "Sites" tree, delete all the nodes under your site except the one whose request doesn't contain a cookie (it should be the root node of your site);
- Run the spider (you will probably need to configure ZAP to ignore the logout URL, otherwise the crawl will end your session);
- Disable the option "Edit" > "Enable Session Tracking (Cookie)"; if it stays enabled, the cookies will be sent twice (or more), which can lead to Issue 15;
- Run the active scan.
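If you prefer to script it, here is a minimal sketch of the same spider-then-scan flow driven through ZAP's API, assuming a ZAP version that exposes the REST API and the zapv2 Python client; the target URL, logout regex, and proxy address are placeholders you will need to adjust:

# Minimal sketch: spider then active scan through ZAP's API.
# Assumes ZAP is already running on localhost:8080 with the API enabled
# (recent versions also require an API key passed as apikey=...).
import time
from zapv2 import ZAPv2

target = 'http://example.com'  # placeholder: your site, logged in via the browser

zap = ZAPv2(proxies={'http': 'http://127.0.0.1:8080',
                     'https': 'http://127.0.0.1:8080'})

# Keep the spider away from the logout URL so it doesn't end the session.
zap.spider.exclude_from_scan('.*logout.*')  # placeholder regex

# Run the spider and wait for it to finish.
scan_id = zap.spider.scan(target)
while int(zap.spider.status(scan_id)) < 100:
    time.sleep(2)

# Run the active scan and wait for it to finish.
scan_id = zap.ascan.scan(target)
while int(zap.ascan.status(scan_id)) < 100:
    time.sleep(5)

# Print the alerts found for the target.
print(zap.core.alerts(baseurl=target))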
I hope it helps.
[1] https://code.google.com/p/zaproxy/issues/detail?id=15

Best regards.
On Tuesday, June 26, 2012 11:37:15 AM UTC+1, Cosmin Stefan wrote:
Hello Mike,
The issue you are mentioning is a well-known limitation of the current crawler, and there is nothing you can do about it for now.
Currently, the crawler is going through a major transformation: it is being rewritten almost from scratch, and the new version will include support for cookies and proper support for user sessions. A version including these features will be available in the trunk of the repository within a couple of weeks (or maybe even sooner).
Thanks and have a great day,
Cosmin
On Mon, Jun 25, 2012 at 11:33 PM, Mike Ruhlin <mi...@ruhl.in> wrote:
Hi guys,
My site uses cookie-based authentication and has at least a few reflected XSS vulnerabilities that only exist for logged-in users. As such, I need to configure the ZAP spider and active scanner to attack all my pages both with and without proper login cookies.
It looks like the HTTP proxy saves cookie information and replays it on subsequent attacks against those URLs, but the spider crawls with no cookies, so of course the active scanner has no cookies to send when it attacks the pages it discovered. Is there any way to set the spider to always send a given cookie?
Thanks!