I'm spidering an application and the spider suddenly got stuck on one of the URLs. There's nothing special about the URL (the spider had already passed successfully through URLs with exactly the same structure).
The spidering process was at 85% when this happened, so I paused the spider so I can try to resume the scan. On the console I can see that the following error was triggered:
1832520 [ZAP-PassiveScanner] ERROR org.zaproxy.zap.ZAP$UncaughtExceptionLogger - Exception in thread "ZAP-PassiveScanner"
java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOfRange(Arrays.java:3221)
at java.lang.String.<init>(String.java:233)
at java.lang.StringBuilder.toString(StringBuilder.java:447)
at org.zaproxy.zap.extension.params.ExtensionParams.setToString(Unknown Source)
at org.zaproxy.zap.extension.params.ExtensionParams.persist(Unknown Source)
at org.zaproxy.zap.extension.params.ExtensionParams.onHttpRequestSend(Unknown Source)
at org.zaproxy.zap.extension.params.ParamScanner.scanHttpRequestSend(Unknown Source)
at org.zaproxy.zap.extension.pscan.PassiveScanThread.run(Unknown Source)
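If I'm reading the stack trace right, the params extension appears to be building one very large string (Arrays.copyOfRange underneath StringBuilder.toString), and "GC overhead limit exceeded" means the JVM was spending nearly all its time in garbage collection while reclaiming almost nothing. For illustration only, here is a minimal, self-contained Java sketch of that failure mode; the class name and values are placeholders of mine, not anything from ZAP:

import java.util.ArrayList;
import java.util.List;

// Minimal repro of this class of OutOfMemoryError: keep growing a
// StringBuilder and retain every toString() copy, so the heap fills with
// live strings and the collector can reclaim almost nothing.
public class GcOverheadDemo {
    public static void main(String[] args) {
        List<String> retained = new ArrayList<>();
        StringBuilder sb = new StringBuilder();
        while (true) {
            sb.append("param=value&");   // the persisted "set" keeps growing
            retained.add(sb.toString()); // toString() copies the whole buffer via Arrays.copyOfRange
        }
    }
}

Run with a small heap (e.g. java -Xmx64m GcOverheadDemo), this should die quickly with an OutOfMemoryError, typically the "GC overhead limit exceeded" variant, depending on the GC in use.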
After that, the log shows it still crawled a couple more URLs, and then:
1837706 [Thread-32] ERROR org.zaproxy.zap.ZAP$UncaughtExceptionLogger - Exception in thread "Thread-32"
java.lang.OutOfMemoryError: GC overhead limit exceeded
So I have two questions:
1 - Is it possible to resume the scan?
2 - If not: if I stop the spider, save the session, and restart ZAP, will the spider resume crawling from the point it had already reached?
What's the best way to proceed? I'm currently keeping ZAP open and the crawl paused.
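In case it matters, this is roughly how I was planning to save the session programmatically, via the ZAP Java client API (zap-api-java). I'm assuming core.saveSession is exposed with this shape; the host, port, and session name are placeholders for my setup, so please correct me if the client's signature differs:

import org.zaproxy.clientapi.core.ApiResponse;
import org.zaproxy.clientapi.core.ClientApi;

public class SaveZapSession {
    public static void main(String[] args) throws Exception {
        // ZAP's API listener; adjust host/port (and the API key, on newer
        // versions) to match the local instance.
        ClientApi api = new ClientApi("localhost", 8080);

        // Persist the current session to disk so nothing is lost if ZAP dies;
        // "true" asks ZAP to overwrite an existing session with the same name.
        ApiResponse resp = api.core.saveSession("interrupted-spider-session", "true");
        System.out.println(resp.toString());
    }
}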
Thanks,