in.read() results in java.lang.OutOfMemoryError for more than 10 000 lines in the same group


Steven Lequient

Jul 26, 2018, 10:27:42 AM
to beanio-users
Hi,

Apparently, the JVM runs out of memory on the result of in.read() when parsing groups of about 10 000 lines.

I'm currently parsing a test file, but the goal is to parse invoices, which may contain groups this large.
Is there any way to circumvent this?
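
The only workaround I can think of so far would be to drop the class binding from the <group> element in the mapping, so that read() returns one record at a time instead of materializing the whole group, and then aggregate the records myself while streaming. Below is a rough sketch of that idea; it assumes removing the group's class attribute is acceptable, and the stream name "invoiceFile", the record name "invoiceHeader" and the process() helper are placeholders rather than the actual names from my SEGMENT.xml. Only StreamFactory, BeanReader, read(), getRecordName() and close() are the real BeanIO API here.

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.beanio.BeanReader;
import org.beanio.StreamFactory;

public class StreamingGroupSketch {

    public static void main(String[] args) throws Exception {
        // Load the mapping. "SEGMENT.xml" and the names below are placeholders.
        StreamFactory factory = StreamFactory.newInstance();
        factory.load("SEGMENT.xml");

        // With a class bound to the whole <group>, a single in.read() call
        // holds every record of the group in memory at once. The loop below
        // assumes that binding has been removed, so read() yields one record
        // at a time and the caller decides how much of the group to keep.
        BeanReader in = factory.createReader("invoiceFile",
                new File("fauxTestSegment-20_000_lignes.txt"));
        try {
            List<Object> currentGroup = new ArrayList<>();
            Object record;
            while ((record = in.read()) != null) {
                // "invoiceHeader" is a hypothetical record name marking the
                // start of a new group; use whatever the mapping declares.
                if ("invoiceHeader".equals(in.getRecordName())) {
                    process(currentGroup);   // flush the previous group
                    currentGroup.clear();
                }
                currentGroup.add(record);
            }
            process(currentGroup);           // flush the last group
        } finally {
            in.close();
        }
    }

    private static void process(List<Object> group) {
        if (group.isEmpty()) {
            return;
        }
        // Handle one logical group here (persist it, write it out, etc.)
        // instead of holding every group in memory at the same time.
        System.out.println("Processed a group of " + group.size() + " records");
    }
}

That said, I'd prefer to keep the group binding if BeanIO offers a way to do it without loading the whole group.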

Attached are my test config file and a test data file.

Also, here is the top of the error output; in short, the JVM runs out of memory and dumps everything.

Thank you very much in advance for your help.

---------




JVMDUMP039I Traitement de l'événement de vidage "systhrow", détails "java/lang/OutOfMemoryError" à 2018/07/26 16:20:19 - Veuillez patienter.
JVMDUMP032I La machine virtuelle Java a demandé un vidage System en utilisant 'C:\PFD\Data\git\eFM\core.20180726.162019.376.0001.dmp' en réponse à un événement
JVMDUMP010I Vidage System écrit dans C:\PFD\Data\git\eFM\core.20180726.162019.376.0001.dmp
JVMDUMP032I La machine virtuelle Java a demandé un vidage Heap en utilisant 'C:\PFD\Data\git\eFM\heapdump.20180726.162019.376.0002.phd' en réponse à un événement
JVMDUMP010I Vidage Heap écrit dans C:\PFD\Data\git\eFM\heapdump.20180726.162019.376.0002.phd
JVMDUMP032I La machine virtuelle Java a demandé un vidage Java en utilisant 'C:\PFD\Data\git\eFM\javacore.20180726.162019.376.0003.txt' en réponse à un événement
JVMDUMP010I Vidage Java écrit dans C:\PFD\Data\git\eFM\javacore.20180726.162019.376.0003.txt
JVMDUMP032I La machine virtuelle Java a demandé un vidage Snap en utilisant 'C:\PFD\Data\git\eFM\Snap.20180726.162019.376.0004.trc' en réponse à un événement
JVMDUMP010I Vidage Snap écrit dans C:\PFD\Data\git\eFM\Snap.20180726.162019.376.0004.trc
JVMDUMP013I Evénement de vidage traité "systhrow", détails "java/lang/OutOfMemoryError".
2018-07-26 16:20:23.620 ERROR 376 --- [nio-9081-exec-3] o.a.c.c.C.[.[.[/].[dispatcherServlet]    : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.OutOfMemoryError: Espace de segment de mémoire Java] with root cause

java.lang.OutOfMemoryError: Espace de segment de mémoire Java
at java.lang.StringBuilder.ensureCapacityImpl(StringBuilder.java:366) ~[na:1.8.0]
at java.lang.StringBuilder.append(StringBuilder.java:232) ~[na:1.8.0]
[...]


Attachments: fauxTestSegment-20_000_lignes.txt, SEGMENT.xml

Vidushi Bassi

Sep 11, 2018, 9:17:47 AM
to beanio-users
Hi Steven,

I am facing a similar issue. I need to parse a file with a million records in it, and the group size can be more than 10 000. Did you find any solution for this?

Thanks in advance.

Regards,
Vidushi