#!/usr/bin/env python
import time
import os
import subprocess
from zapv2 import ZAPv2
# Start ZAP
subprocess.Popen(['/path/to/zap.sh', '-daemon'], stdout=open(os.devnull, 'w'))
time.sleep(10)
zap = ZAPv2()
# Spider and scan...
print('Session will be saved to: %s' % zap.core.home_directory)
zap.core.save_session('Session Name')
# or if you want to save to other directory than "home directory"
#zap.core.save_session('/path/to/dir/Session Name')
# Shutdown ZAP
zap.core.shutdown()  # note the parentheses: without them nothing is called
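The fixed `time.sleep(10)` after launching the daemon is fragile: on a slow machine ZAP may not be ready yet, and on a fast one you wait longer than needed. One common pattern is to poll the API until it answers. A minimal sketch (the `probe` callable would typically be something like `lambda: zap.core.version`; the helper name is illustrative, not part of the ZAP client):

```python
import time

def wait_until_ready(probe, timeout=60.0, interval=1.0):
    """Call probe() until it stops raising, or give up after timeout seconds.

    Returns True as soon as probe() succeeds, False if the deadline passes.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            probe()          # e.g. lambda: zap.core.version
            return True
        except Exception:    # daemon not accepting API calls yet
            time.sleep(interval)
    return False
```

After the `subprocess.Popen(...)` line, `wait_until_ready(lambda: zap.core.version)` can then replace the fixed sleep.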
After setting a Session as active, all subsequent requests sent by or through ZAP have their headers modified to match that particular session. So far, only cookie-based authentication is supported for this process. All of the aforementioned steps are also available via the API, so they can easily be scripted.
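Driving that step from Python might look like the sketch below. This assumes the API client exposes the `httpsessions` component with a `set_active_session(site, session)` call; the site and session names are placeholders:

```python
def activate_session(zap, site, session_name):
    """Mark an existing HTTP session as active for the given site, so that
    subsequent requests through ZAP carry that session's cookies.

    Assumes the client's httpsessions component; session_name must already
    exist (e.g. created by browsing through the proxy)."""
    zap.httpsessions.set_active_session(site, session_name)

# usage sketch (placeholder names):
# activate_session(zap, 'lee.beiservices.com', 'Session 1')
```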
We are currently working on a new set of features that will let ZAP users define Users and Roles in a more consistent and straightforward way, so that actions such as spidering (and many others) can be run from the point of view of a user. But this is just in its inception stage, so in the meantime you should follow the steps above.
--
target = 'http://lee.beiservices.com/'
zap.urlopen(target)
# Access other pages
zap.urlopen(target + 'cgi01')
zap.urlopen(target + 'cgi02')
zap.urlopen(target + 'cgi03')
# Give the sites tree a chance to get updated
time.sleep(2)
print('Spidering target %s' % target)
zap.spider.scan(target)
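`zap.spider.scan(...)` returns immediately and the crawl runs in the background, so saving the session or kicking off an active scan right away can capture an incomplete picture. A hedged sketch of waiting for the spider to finish (in this release `zap.spider.status` reports progress as a percentage string; `get_status` below stands in for `lambda: zap.spider.status`):

```python
import time

def wait_for_completion(get_status, interval=2.0, timeout=600.0):
    """Poll get_status() until it reports '100' percent, or the timeout expires.

    Returns True on completion, False if the deadline passes first.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        if str(get_status()) == '100':
            return True
        time.sleep(interval)
    return False
```

For example, `wait_for_completion(lambda: zap.spider.status)` before the active scan, and the same pattern with `zap.ascan.status` before saving the session.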
When I use the GUI in conjunction with Firefox and manually navigate the site, all of the various CGI scripts (pages) show up under the lee.beiservices.com site, but they do not show up for a Quick Start or a regular spider scan.
One other interesting thing I noticed was that there were no POST requests in the automated scan, just GET. Perhaps this will change when I add those URLs.
Is there a status for saving a session, similar to the status used for the spider and scan processes? If one exists, it would help me know when the session has been successfully saved. Thanks
Is there a syntax I can use to add POST data to a URL in my urlopen statements?
import urllib  # needed for urlencode (Python 2)

post_data = {'parameter1': 'value1', 'parameter2': 'value2'}
# a second, urlencoded argument makes urllib send a POST instead of a GET
zap.urlopen('http://localhost/', urllib.urlencode(post_data))
I am executing ZAP by opening a terminal, navigating to /opt/zap/ZAP_D-2013-04-29 (into which I have placed the new zap.sh file you attached), and then, at the command line as a regular user, typing sh zap.sh. When I use ZAP_2.1.0 with the old zap.sh, this opens the GUI normally, but when I follow the same steps using ZAP_D-2013-04-29 and the updated zap.sh you sent, that is where I get the Java error.
Under Spider in Options, "POST forms" is already checked, so it should be submitting them. And indeed, when I do a scan via the GUI, the POST requests appear right alongside the GET requests. So far it is only the sessions generated by the automated Python script that have no POST requests. I specifically added POST requests per your instructions, and still none appear in the automatic scan.
When starting ZAP via the python script using:
subprocess.Popen(['/opt/zap/ZAP_2.1.0/zap.sh','-daemon'],stdout=open(os.devnull,'w'))
the GUI is not opened (I assume that is because of -daemon).
Are the current settings of ZAP still available in daemon mode or are there additional settings I need to set via the script?
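For what it's worth, daemon mode reads the same saved configuration the GUI uses, and ZAP's launcher also accepts `-config key=value` overrides on the command line. A sketch of building such a launch command; the `spider.postform` key is an assumption modeled on the option names in ZAP's config.xml, and the paths are placeholders:

```python
import os
import subprocess

def zap_daemon_cmd(zap_sh, overrides):
    """Build the argv for launching ZAP headless; each override becomes a
    -config key=value pair on the command line."""
    cmd = [zap_sh, '-daemon']
    for key in sorted(overrides):
        cmd += ['-config', '%s=%s' % (key, overrides[key])]
    return cmd

# usage sketch (path and option name are assumptions):
# subprocess.Popen(zap_daemon_cmd('/opt/zap/ZAP_2.1.0/zap.sh',
#                                 {'spider.postform': 'true'}),
#                  stdout=open(os.devnull, 'w'))
```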
Does the GUI need to be fully opened for the script generated scan to fully work?
What I tried now is opening the GUI, then opening Firefox, and doing a manual walkthrough of one section of our site on my local machine. Then I ran an active scan on that. Next I went through the Sites pane, made a list of all of the URLs there, copied them into my Python script, and ran the script to do the same thing automatically. The automatic scan generated by the Python script took much longer than any other automatic scan up to this point, though not quite as long as the active scan done via the GUI, and I saved the session via the script. When I opened the automatic session and compared it to the GUI-generated session, the GUI-generated session showed considerably more alerts, including 7 High warnings that do not show up in the Python-script-generated scan. I am getting closer, but I still don't see where the difference between the two scans (using the same URLs) comes from.