Spider seems to be stuck: OutOfMemory


Er Galvão Abbott

Jul 18, 2012, 4:35:24 AM
to zaprox...@googlegroups.com
I'm spidering an application and the spider suddenly got stuck on one of the URLs. There's nothing special about the URL (the spider had successfully passed through URLs which had the exact same structure).

The spidering process was at 85% when this happened, so I paused the spider so I can try to resume the scan later. On the console I can see that the following error was triggered:

1832520 [ZAP-PassiveScanner] ERROR org.zaproxy.zap.ZAP$UncaughtExceptionLogger  - Exception in thread "ZAP-PassiveScanner"
java.lang.OutOfMemoryError: GC overhead limit exceeded
        at java.util.Arrays.copyOfRange(Arrays.java:3221)
        at java.lang.String.<init>(String.java:233)
        at java.lang.StringBuilder.toString(StringBuilder.java:447)
        at org.zaproxy.zap.extension.params.ExtensionParams.setToString(Unknown Source)
        at org.zaproxy.zap.extension.params.ExtensionParams.persist(Unknown Source)
        at org.zaproxy.zap.extension.params.ExtensionParams.onHttpRequestSend(Unknown Source)
        at org.zaproxy.zap.extension.params.ParamScanner.scanHttpRequestSend(Unknown Source)
        at org.zaproxy.zap.extension.pscan.PassiveScanThread.run(Unknown Source)

After that, it shows that it still crawled a couple more URLs, and then:

1837706 [Thread-32] ERROR org.zaproxy.zap.ZAP$UncaughtExceptionLogger  - Exception in thread "Thread-32"
java.lang.OutOfMemoryError: GC overhead limit exceeded

So I have two questions:

1 - Is it possible to resume the scan?
2 - If not, and I stop the spider, save the session, and restart ZAP, will the spider crawl only from the point it has already reached?

What's the best way to proceed? I'm currently keeping ZAP open and the crawling paused.

Thanks,

Simon Bennetts

Jul 18, 2012, 4:52:52 AM
to zaprox...@googlegroups.com
Hi,

What OS are you using?

You should save the session (via the File menu or toolbar button) first.
But you will need to stop and restart ZAP with more memory.
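For reference, the JVM's maximum heap is fixed at startup, so "more memory" means passing a larger `-Xmx` value when ZAP is launched. A minimal sketch (the jar filename and 2 GB figure below are assumptions; adjust for your install):

```shell
# Launch the ZAP jar directly with a 2 GB maximum heap.
# "zap-2.x.x.jar" is a placeholder for your actual ZAP jar name.
java -Xmx2048m -jar zap-2.x.x.jar

# Alternatively, edit the existing -Xmx value in the launch script
# (zap.sh on Linux/macOS, zap.bat on Windows) in the ZAP install directory.
```

Either route has the same effect; the launch-script edit just makes the setting persistent across restarts.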

You'll be able to open the saved session, but you'll have to start the spider again.

Many thanks.

Simon

Yuki Imoto

Mar 2, 2017, 8:12:11 PM
to OWASP ZAP User Group
Please let me ask a related question.

I'm also spidering an application, and the spider suddenly got stuck at around 370,000 URLs. (Could the spider be stuck because this application has so many URLs?)

The spidering process stayed at only 1% when this happened, so I had no choice but to stop the ZAP service. On the console I can see that the following error was triggered:

Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "ZAP-ProxyServer"
java.lang.OutOfMemoryError: Java heap space
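"Java heap space" means the spider filled the JVM's maximum heap. As a quick sanity check before re-running, you can print the heap ceiling the JVM was actually given (plain Java, not ZAP-specific; `HeapCheck` is just an illustrative name):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() returns the heap ceiling in bytes: the -Xmx value
        // if one was passed, otherwise the JVM's platform default.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

If this prints only a few hundred MB, raising `-Xmx` before restarting the spider is the first thing to try.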


So I have two questions:

1 - Is it possible to resume the scan from the last page it reached?
2 - If not, should I restrict the spider to the current domain (rather than spidering all pages)?

What's the best way to proceed?

Thank you,
