Finding aid report - java.lang.OutOfMemoryError


amarildo rod

Aug 2, 2019, 9:08:04 AM
to AtoM Users
Hello, when I generate the Finding aid report the following error happens:

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded

The file "php3wjUT2" is 143 MB and has approximately 1,761,547 lines.

"arFindingAidJob": Running: java -jar '/var/www/html/atom/atom-2.4.1/lib/task/pdf/saxon9he.jar' -s:'/tmp/phpnAm9Fl' -xsl:'/var/www/html/atom/atom-2.4.1/lib/task/pdf/ead-pdf-inventory-summary.xsl' -o:'/tmp/php3wjUT2' 2>&1

"arFindingAidJob": Running: fop -r -q -fo '/tmp/php3wjUT2' -pdf '/var/www/html/atom/atom-2.4.1/downloads/legislative-language-assembly-general-3.pdf' 2>&1

"arFindingAidJob": Converting the EAD FO to PDF has failed.

I split the php3wjUT2 file into three parts of about 50 MB each, ran the command line below on each part, and the PDFs were generated successfully.

fop -r -q -fo '/tmp/php3wjUT2' -pdf '/var/www/html/atom/atom-2.4.1/downloads/legislative-assembly-of-blade-general-3.pdf' 2>&1

Server settings:
OS: Linux (CentOS)
Memory: 16 GB
CPUs: 4 @ 2600 MHz
Free disk: 34 GB

Are there any adjustments I can make to generate the report without splitting the file?

Thank you.

Dan Gillean

Aug 2, 2019, 12:09:31 PM
to ICA-AtoM Users
Hi Amarildo, 

I looked up the error reported online, and found this information: 

The parallel collector will throw an OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown. This feature is designed to prevent applications from running for an extended period of time while making little or no progress because the heap is too small. If necessary, this feature can be disabled by adding the option -XX:-UseGCOverheadLimit to the command line.

That said, while the PDF you were generating is quite large, if 98% of the processing time is spent in garbage collection, then it sounds like there is further optimization we might be able to do in the code in the future to prevent this kind of error. I have filed issue #13133 to track this for future analysis.
In the meantime, I'm not sure what to suggest. If possible, you can try adding more memory to your deployment and increasing the Java heap space size to see if that helps.
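For example, you could re-run the failing Saxon step by hand with a larger heap to test whether more memory alone fixes it. This is only a sketch: the paths are copied from your job log, and the 4 GB heap size (-Xmx4g) is an assumed value I picked for your 16 GB server, not an AtoM default.

```shell
# Sketch: re-run the Saxon XSLT transform from the job log with a 4 GB heap.
# -Xmx4g is an assumption (tune to your server); the paths come from the
# "arFindingAidJob" log lines in your message.
java -Xmx4g -jar '/var/www/html/atom/atom-2.4.1/lib/task/pdf/saxon9he.jar' \
  -s:'/tmp/phpnAm9Fl' \
  -xsl:'/var/www/html/atom/atom-2.4.1/lib/task/pdf/ead-pdf-inventory-summary.xsl' \
  -o:'/tmp/php3wjUT2' 2>&1
```

If that succeeds where the job failed, it points to heap size rather than the stylesheet itself.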

You could try adding the -XX:-UseGCOverheadLimit option to your JVM startup parameters (e.g. typically in ~/.bashrc via the $JAVA_OPTS variable), but as the documentation quoted above notes, this may not resolve the issue - it may instead just exhaust the memory and return a general OutOfMemoryError.
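As a rough sketch of what that could look like (the 4 GB heap value is an assumption, and you should verify which variables your particular wrapper scripts actually read):

```shell
# Sketch: JVM options supplied via environment variables, e.g. in ~/.bashrc.
# Many Java launchers read JAVA_OPTS, and Apache FOP's wrapper script reads
# FOP_OPTS. -XX:-UseGCOverheadLimit disables the "98% of time in GC" check,
# but the JVM can still exhaust the heap, so a larger -Xmx is usually
# needed as well. -Xmx4g is an assumed value for a 16 GB server.
export JAVA_OPTS="-Xmx4g -XX:-UseGCOverheadLimit"
export FOP_OPTS="-Xmx4g -XX:-UseGCOverheadLimit"
```

After editing ~/.bashrc you would need to reload it (or restart the worker) so the job's environment picks up the new values.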

If your institution is interested in sponsoring the analysis and code optimizations described on issue #13133 so we can prioritize it for inclusion in an upcoming release, please feel free to contact me off-list. Otherwise, we will try to address it at some point when we have available resources. 


Dan Gillean, MAS, MLIS
AtoM Program Manager
Artefactual Systems, Inc.


amarildo rod

Aug 2, 2019, 12:22:05 PM
to AtoM Users
Thanks Dan.