Hello,
I'm a cybersecurity student from France (which might explain any language mistakes in this message), and I'm trying to learn how to use ZAP.
I'm doing an internship at a company that develops a web application for business management (billing, budget management, ...). My task as an intern is to learn ZAP and build an automated way to run scans.
So far, I have managed to run ZAP against my own test environment (a test server running Nextcloud). I was able to run both manual and automated scans (with the Automation Framework), and I even got authentication working: ZAP handled it almost instantly.
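For context, the authentication part of the plan that worked on my Nextcloud server looked roughly like this (the URL and browser are placeholders, not my real values):

```yaml
# browser-based authentication, as set up for my Nextcloud test server;
# the loginPageUrl below is a placeholder
authentication:
  method: "browser"
  parameters:
    loginPageUrl: "https://nextcloud_test_server.local/login"
    browserId: "firefox"
    loginPageWait: 5
```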
Now that I am working on my company's test instance, I'm having trouble making it work. Their app is quite modern, which makes it harder to scan than my Nextcloud. I know that the login process is handled by a different service/server than the main app, and from what I've heard there are several redirections between the login page and the main app page. That doesn't seem to be a problem for ZAP's authentication test, but it might be one of the sources of my problems.
Here are the points I'm having trouble with:
- The crawlers (Spider and AJAX Spider) don't seem to scan everything on the page: I cannot see the entire website architecture in ZAP, and they also seem to loop at times. I know the crawlers are not working well because even if I start a scan and log in to the website manually, the crawlers still don't cover the whole website. It's as if none of the tabs in the menu are ever clicked by the crawlers.
- I am not sure authentication really works, but I don't understand how to check that, since I seem to have a crawler problem too. The authentication test passes, but I'm not sure the session is kept alive when other jobs run after the "requestor" one in the Automation Framework plan.
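Is pointing the verification at page content instead of status codes the right way to check this? Something like the fragment below is what I have in mind (the regex strings and the poll URL are just placeholders for text/pages that only exist when logged in or out):

```yaml
verification:
  method: "poll"
  # placeholders: text that appears only when logged in / logged out
  loggedInRegex: "\\QSign out\\E"
  loggedOutRegex: "\\QPlease log in\\E"
  pollFrequency: 60
  pollUnits: "requests"
  # placeholder: a page that requires authentication
  pollUrl: "https://company_test_env_addresse.com/dashboard"
  pollPostData: ""
```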
- When the AJAX Spider runs in my plan, it opens a Chrome window and tries to log in again, but it fails because it enters the email address twice (or the address is kept from the previous login; either way I end up with two email addresses in the field, which blocks the login). Even if I type the right credentials myself, the same window opens again a few seconds after the first one closes. Is my crawler logging out again and again?
Here is my .yaml file (modified for privacy).
Any help is welcome. Yes, I'm probably doing something wrong, but I've been trying to make this work for more than two weeks, I've watched a lot of your videos, and I feel I cannot really go further without help.
Thanks in advance to anyone who takes some time to help me :)
---------------------------yaml configuration file--------------------------------
---
env:
  contexts:
  - name: "company_test_env"
    urls:
    - "https://company_test_env_addresse.com"
    - "https://company_login_page_test_env.com"
    excludePaths:
    authentication:
      method: "browser"
      parameters:
        loginPageUrl: "https://company_login_page_test_env.com"
        browserId: "chrome"
        loginPageWait: 30
      verification:
        method: "poll"
        loggedInRegex: "\\Q 200 OK\\E"
        loggedOutRegex: "\\Q 302 Found\\E"
        pollFrequency: 60
        pollUnits: "requests"
        pollUrl: "https://company_test_env_addresse.com"
        pollPostData: ""
    sessionManagement:
      method: "headers"
      parameters:
        Cookie: "cookies here (automatically generated by the requestor)"
    technology:
      exclude: []
      include: []
    users:
    - name: "myn...@company.com"
      credentials:
        password: "Super secret and unbreakable password"
        username: "myn...@company.com"
  parameters:
    failOnError: true
    failOnWarning: false
    progressToStdout: true
  vars: {}
jobs:
- parameters:
    scanOnlyInScope: true
    enableTags: false
    disableAllRules: false
    rules: []
  name: "passiveScan-config"
  type: "passiveScan-config"
- parameters:
    user: "myn...@company.com"
  requests:
  - url: "https://company_login_page_test_env.com"
    name: ""
    method: ""
    httpVersion: ""
    headers: []
    data: ""
  name: "requestor"
  type: "requestor"
- parameters:
    context: ""
    user: "myn...@company.com"
    url: "https://company_test_env_addresse.com"
    maxDuration: 30
    maxDepth: 5
    maxChildren: 0
    acceptCookies: false
    handleODataParametersVisited: false
    handleParameters: "IGNORE_COMPLETELY"
    maxParseSizeBytes: 0
    parseComments: false
    parseGit: false
    parseRobotsTxt: false
    parseSitemapXml: false
    parseSVNEntries: false
    postForm: false
    processForm: false
    requestWaitTime: 0
    sendRefererHeader: false
    userAgent: ""
  tests:
  - onFail: "INFO"
    statistic: "automation.spider.urls.added"
    site: ""
    operator: ">="
    value: 100
    name: "At least 100 URLs found"
    type: "stats"
  name: "spider"
  type: "spider"
- parameters:
    context: ""
    user: "myname@company"
    url: "https://company_test_env_addresse.com"
    maxDuration: 60
    maxCrawlDepth: 10
    numberOfBrowsers: 16
    browserId: "chrome-headless"
    maxCrawlStates: 0
    eventWait: 1000
    reloadWait: 1000
    clickDefaultElems: true
    clickElemsOnce: true
    randomInputs: true
    inScopeOnly: true
    runOnlyIfModern: false
  tests:
  - onFail: "INFO"
    statistic: "spiderAjax.urls.added"
    site: ""
    operator: ">="
    value: 100
    name: "At least 100 URLs found"
    type: "stats"
  name: "spiderAjax"
  type: "spiderAjax"
- parameters:
    maxDuration: 0
  name: "passiveScan-wait"
  type: "passiveScan-wait"
- parameters:
    context: ""
    user: "https://company_test_env_addresse.com"
    policy: ""
    maxRuleDurationInMins: 0
    maxScanDurationInMins: 0
    addQueryParam: false
    delayInMs: 0
    handleAntiCSRFTokens: false
    injectPluginIdInHeader: false
    scanHeadersAllRequests: false
    threadPerHost: 16
    maxAlertsPerRule: 0
  policyDefinition:
    defaultStrength: "medium"
    defaultThreshold: "medium"
    rules: []
  name: "activeScan"
  type: "activeScan"
- parameters:
    template: "risk-confidence-html"
    theme: "original"
    reportDir: "/home/myname/Documents/Zap_reports"
    reportFile: ""
    reportTitle: "ZAP Scanning Report"
    reportDescription: ""
    displayReport: false
  risks:
  - "info"
  - "low"
  - "medium"
  - "high"
  confidences:
  - "falsepositive"
  - "low"
  - "medium"
  - "high"
  - "confirmed"
  sections:
  - "siteRiskCounts"
  - "responseBody"
  - "appendix"
  - "alertTypes"
  - "responseHeader"
  - "alertTypeCounts"
  - "riskConfidenceCounts"
  - "alerts"
  - "aboutThisReport"
  - "contents"
  - "requestBody"
  - "reportDescription"
  - "reportParameters"
  - "requestHeader"
  - "summaries"
  name: "report"
  type: "report"