Hi Amarildo,
I looked up the error you reported, and found this information online:
The parallel collector will throw an OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown. This feature is designed to prevent applications from running for an extended period of time while making little or no progress because the heap is too small. If necessary, this feature can be disabled by adding the option -XX:-UseGCOverheadLimit to the command line.
That said, while the PDF you were generating is quite large, if 98% of the processing time is being spent in garbage collection, it suggests there are further code optimizations we could make to prevent this kind of error. I have filed issue ticket #13133 to track this for future analysis.
In the meantime, I'm not sure what to suggest. If possible, you could try adding more memory to your deployment and increasing the Java heap size to see if that helps.
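For example, assuming your deployment reads JVM options from a $JAVA_OPTS variable and the machine has memory to spare, you could raise the maximum heap with something like the following (the 4g value is just an illustration; adjust it to your environment):

    export JAVA_OPTS="$JAVA_OPTS -Xmx4g"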
You could also try adding the -XX:-UseGCOverheadLimit option to your JVM startup parameters (typically via the $JAVA_OPTS variable, e.g. in ~/.bashrc), but as the information above notes, this may not resolve the issue; it may instead just exhaust the memory and produce a general OutOfMemoryError.
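Assuming the same $JAVA_OPTS setup as above, that would look something like:

    export JAVA_OPTS="$JAVA_OPTS -XX:-UseGCOverheadLimit"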
If your institution is interested in sponsoring the analysis and code optimizations described in issue #13133 so we can prioritize it for inclusion in an upcoming release, please feel free to contact me off-list. Otherwise, we will try to address it when resources become available.
Regards,