Hi Folks,
I have lots of URLs and HAR format files generated by a crawler.
URL list
--------
a.com/a-path
b.com/a-path?with_query=string
subdomain.c.com/another/url/a.php
....
HAR files
---------
har_file_for_a.com_a_path
har_file_for_b.com_a_path
har_file_for_subdomain.c.com_another_url_a_php
I'm using Python for my projects, in a Docker environment.
My crawler generates a few hundred URLs per day, and even more HAR files (some URLs have multiple inputs).
I'm trying to scan the URLs in the list, together with their HAR files, for just a few vulnerabilities. (The HAR files can contain GET or POST data.)
I researched ZAP and found three options:
1. Using zap-cli (https://github.com/Grunny/zap-cli): quick-scan looks fine, but there is no option to supply a HAR file.
2. Using zap-api-python (https://github.com/zaproxy/zap-api-python): I tested this project, but it has the same problem as zap-cli, no HAR file option.
3. Using the raw ZAP API: I have no idea where to start.
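For context on what I've tried for option 3: as far as I understand, the raw ZAP API is just HTTP GETs against the running ZAP daemon, in the form /JSON/<component>/action/<action>/. Below is a minimal sketch of what I imagine the call would look like; it assumes ZAP is listening on localhost:8080 with an API key, and that an exim/importHar action exists (I believe the Import/Export add-on provides it, but please correct me if the component or parameter names are wrong):

```python
# Sketch of calling the raw ZAP JSON API over HTTP.
# ASSUMPTIONS: ZAP at localhost:8080 with API key "changeme";
# the "exim" component's "importHar" action with a "filePath"
# parameter comes from the Import/Export add-on -- verify it
# exists in your ZAP version before relying on it.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

ZAP_BASE = "http://localhost:8080"

def zap_api_url(component: str, action: str, params: dict,
                apikey: str = "changeme") -> str:
    """Build a raw ZAP JSON API URL: /JSON/<component>/action/<action>/?..."""
    query = urlencode({"apikey": apikey, **params})
    return f"{ZAP_BASE}/JSON/{component}/action/{action}/?{query}"

def import_har(har_path: str, apikey: str = "changeme") -> dict:
    """Ask ZAP to import a HAR file into its site tree (assumed endpoint)."""
    url = zap_api_url("exim", "importHar", {"filePath": har_path}, apikey)
    with urlopen(url) as resp:  # network call -- only works with ZAP running
        return json.load(resp)
```

If the import works, I'd then presumably start an active scan per URL via the ascan component, but I'm unsure of the exact sequence.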
Could you please help me solve this problem?
- What is the best method to achieve this?
- Am I heading in the right direction?
- How should I use the ZAP API? (Which requests should I make?)
Thank you,
Angus