Running a spider without active scan afterwards on command line


Tim Krax

Aug 3, 2022, 12:34:28 PM
to OWASP ZAP User Group
Hello all,
I want to run a spider from the command-line to get all the URLs of a website without running an active scan afterwards (as it takes around 40 hours), but I am not able to figure out how. 

I have the following configuration:
OS: Ubuntu 20.04.1 x86_64
Java version: 18.0.2 (OpenJDK)
Available memory: 64311 MB
Using JVM args: -Xmx32155m
ZAP: 2.11.1, all add-ons installed

I used the command:
./zap.sh -cmd -quickurl https://qs.example.com/ -quickprogress -quickout logs/qs_URL.xml

The spider finishes after a few minutes, but the active scan crashed after about 40 hours because the data file grew too large.
I assume that, because of the crash, the file I specified with "-quickout" was never written either.

Could someone give me a hand and tell me how I can run only the spider on the command line?

Thanks a lot in advance
Tim 


kingthorin+owaspzap

Aug 3, 2022, 3:25:36 PM
to OWASP ZAP User Group
As far as I know there isn't a way to do that from the command line alone. You'd have to use the Automation Framework or actually interact with the API.

Maybe there's an option in zap-cli, but it's third party, so you'd have to check its docs.
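If you go the API route, a spider-only run can be sketched with the python-owasp-zap-v2.4 client talking to a ZAP daemon. The helper name and polling interval below are illustrative, not part of the client library:

```python
import time

def spider_only(zap, target, poll_seconds=2):
    """Run only ZAP's spider against `target` and return the crawled URLs.

    `zap` is a zapv2.ZAPv2 client (from python-owasp-zap-v2.4) pointed at
    a running ZAP daemon, e.g. one started with `./zap.sh -daemon`.
    """
    # Kick off the spider; scan() returns the new scan's id
    scan_id = zap.spider.scan(target)
    # spider.status reports percent complete as a string, e.g. "42"
    while int(zap.spider.status(scan_id)) < 100:
        time.sleep(poll_seconds)
    # results() returns the list of URLs the spider found
    return zap.spider.results(scan_id)
```

No active scan job is ever started here; the daemon keeps running until you stop it yourself.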

Simon Bennetts

Aug 4, 2022, 3:24:23 AM
to OWASP ZAP User Group
I'd recommend using the Automation Framework - you can run this from the command line and it gives you the level of control you're looking for: https://www.zaproxy.org/docs/automate/automation-framework/
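For reference, a minimal spider-only plan might look roughly like this (the context name and report settings are placeholders to adapt; the report job is optional and shown only as one way to get the results onto disk):

```yaml
# Sketch of an Automation Framework plan: spider a site and
# write a report, with no activeScan job at all.
env:
  contexts:
    - name: "Default Context"        # placeholder context name
      urls:
        - "https://qs.example.com/"
jobs:
  - type: spider
    parameters:
      context: "Default Context"
      url: "https://qs.example.com/"
  - type: report                     # optional: write results to disk
    parameters:
      template: "traditional-xml"
      reportDir: "logs"              # placeholder output directory
      reportFile: "qs_URL"
```

You'd run it with something like `./zap.sh -cmd -autorun /path/to/plan.yaml`.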

Cheers,

Simon

Tim Krax

Aug 4, 2022, 4:42:05 PM
to OWASP ZAP User Group
Thanks a lot for your answers; I was able to set it up with the Automation Framework.

Cheers
Tim

Simon Bennetts

Aug 5, 2022, 2:42:53 AM
to OWASP ZAP User Group
Great - thanks for letting us know!