ZAP Desktop


Steve Williams

unread,
Jun 12, 2023, 3:47:43 PM
to OWASP ZAP User Group
Hello. Is there any explanation of how to populate the "Sites" section within a session? (screenshot attached: Xnip2023-06-12_21-41-32.png)
I have observed that this section remains empty until I log into a web application in a browser that utilizes ZAP as a proxy. The more URLs I manually navigate through, the more URLs the spider tool discovers. Am I understanding correctly that in order to set up Jenkins CI with ZAP integrated, I need to visit as many endpoints and other links as possible beforehand, because otherwise, the Spider won't even discover them? Apologies if this question seems trivial. Thank you.

psiinon

unread,
Jun 13, 2023, 4:29:43 AM
to zaprox...@googlegroups.com
The Sites tree is a hierarchic representation of the HTTP(S) requests and responses that have been generated by or proxied through ZAP.

There are lots of ways to populate the sites tree:
  • Manually exploring while proxying through ZAP - very effective, but no good for automation
  • Proxying tests through ZAP - great if you have them
  • Using the traditional spider - very effective for traditional apps
  • Using the AJAX spider - better for "modern" apps, ie ones which make heavy use of JavaScript
  • Importing API definitions (like OpenAPI, GraphQL, SOAP) - effective if you have them
  • Importing logs

Exploring the target app as effectively as possible is key to being able to attack it effectively.
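For example, a minimal Automation Framework plan combining a couple of these options could look something like the sketch below. The target URL, durations, and report directory are placeholders, not recommendations:

```yaml
env:
  contexts:
    - name: "Default Context"
      urls:
        - "https://example.com"     # placeholder target
jobs:
  - type: spider                    # traditional spider
    parameters:
      maxDuration: 5                # minutes
  - type: spiderAjax                # AJAX spider, for JavaScript-heavy apps
    parameters:
      maxDuration: 5
  - type: activeScan
  - type: report
    parameters:
      template: "traditional-html"
      reportDir: "/tmp/zap-reports" # placeholder output directory
```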

Cheers,

Simon

On Mon, Jun 12, 2023 at 8:47 PM Steve Williams <653...@gmail.com> wrote:


--
OWASP ZAP Project leader

Steve Williams

unread,
Jun 13, 2023, 5:31:05 AM
to OWASP ZAP User Group
I have exhaustively clicked on every possible element in my web application, resulting in over 300 URLs for potential attack. However, it appears that when using automation from the command line, I cannot utilize the same set of URLs in the sites tree. The spider only recognizes a single URL. Please advise.
On Tuesday, June 13, 2023 at 10:29:43 UTC+2, psi...@gmail.com:

Steve Williams

unread,
Jun 13, 2023, 6:12:54 AM
to OWASP ZAP User Group
Is it accurate to say that the only way to scan all the endpoints when running the Automation Framework from the command line is to add them manually to the context? Or is the session with its Sites tree somehow available from the command line?
On Tuesday, June 13, 2023 at 11:31:05 UTC+2, Steve Williams:

psiinon

unread,
Jun 13, 2023, 6:43:11 AM
to zaprox...@googlegroups.com
No - the only option that won't work in automation is the manual proxying option.
The usual approach is to use one or both of the spiders, and to import any API definitions available.
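If an OpenAPI definition is available, importing it is just another job in the plan. A sketch, assuming the openapi add-on is installed (the apiUrl and context name here are placeholders):

```yaml
jobs:
  - type: openapi
    parameters:
      apiUrl: "https://example.com/openapi.json"  # placeholder definition URL
      context: "Default Context"                  # placeholder context name
```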


Steve Williams

unread,
Jun 13, 2023, 8:10:13 AM
to OWASP ZAP User Group
I don't understand why, when I start ZAP from the command line like

 .\zap.bat -cmd -autorun .\zap_plan.yaml

the spider can only find 1 URL, while with the same plan in the Desktop the spider can see 300+ URLs. What am I doing wrong? Thanks.


On Tuesday, June 13, 2023 at 11:31:05 UTC+2, Steve Williams:

Steve Williams

unread,
Jun 13, 2023, 8:46:58 AM
to OWASP ZAP User Group
It appears that I have identified the reason for the discrepancy. I added the "-session" option, and now the reports appear to be similar.
Thanks.

On Tuesday, June 13, 2023 at 14:10:13 UTC+2, Steve Williams:

psiinon

unread,
Jun 13, 2023, 8:51:06 AM
to zaprox...@googlegroups.com
Thanks for letting us know!

Illia

unread,
Sep 13, 2023, 8:24:34 AM
to ZAP User Group
Hi Steve,

I am trying to run ZAP using the Automation Framework too, and I am using the same parameters as you (-cmd, -session, -autorun File.yaml).

Can I know if the scan shows consistent output every time you run it via the command line? In my case, I notice that my website's audit log sometimes shows all modules scanned and other times only the login modules.
My command is like this: .\zap.bat -cmd -config network.connection.timeoutInSecs=180 -config rules.domxss.browserid=chrome-headless -autorun File.yaml -session "path\to\session"

I reused the same session file in which I had configured the context and authentication for the .yaml file. Then I specify an import-URLs job, an active scan job, and a report job in my .yaml file.
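A sketch of a plan along those lines might look like the following; the file paths and context name are placeholders, and the parameter names are my best understanding of the Automation Framework's import, activeScan, and report job types:

```yaml
jobs:
  - type: import                      # import a list of URLs into the Sites tree
    parameters:
      type: url
      fileName: "/path/to/urls.txt"   # placeholder: one URL per line
  - type: activeScan
    parameters:
      context: "Default Context"      # placeholder context name
  - type: report
    parameters:
      template: "traditional-html"
      reportDir: "/path/to/reports"   # placeholder output directory
```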

May I know if you are doing the same thing?