OutOfMemoryError while using BatchInsert


Sukaant Chaudhary

Nov 3, 2015, 9:56:52 AM
to ne...@googlegroups.com
Hi,
I'm using BatchInsert to insert 30 million records. It works fine for 10 million, but when I try to insert 20 million records I get the following error:

Exception in thread "main" java.lang.OutOfMemoryError: Requested array size exceeds VM limit
        at java.util.Arrays.copyOf(Arrays.java:2271)
        at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
        at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
        at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:140)
        at java.io.OutputStream.write(OutputStream.java:75)
        at com.amat.hbase.dao.impl.ItemDimDAOImpl.getItemDetail(ItemDimDAOImpl.java:49)
        at com.amat.transfer.ImpalaTransferStart.main(ImpalaTransferStart.java:16)

Any help will be highly appreciated.

-Sukaant Chaudhary

Michael Hunger

Nov 3, 2015, 7:00:15 PM
to ne...@googlegroups.com
You have to share your code. It looks as if your DAO is writing to a stream that is actually a byte array in memory, which at some point cannot grow any further.
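For what it's worth, the usual fix in this situation is to stop buffering the whole export in a `ByteArrayOutputStream` (whose backing array must fit on the heap as one contiguous block) and instead stream each record straight to disk. A minimal sketch below, assuming the records come from some DAO; `fetchRecord` and `export` are hypothetical stand-ins, not your actual `ItemDimDAOImpl` API:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamingExport {

    // Hypothetical stand-in for a DAO lookup; in the real job each
    // record would come from HBase via ItemDimDAOImpl.
    static String fetchRecord(long id) {
        return id + ",item-" + id;
    }

    // Stream records straight to a file. Heap usage stays roughly
    // constant no matter how many records are written, unlike a
    // ByteArrayOutputStream, which buffers the entire output in a
    // single heap array and eventually hits the VM array-size limit.
    static void export(Path out, long count) throws IOException {
        try (BufferedWriter writer = Files.newBufferedWriter(out)) {
            for (long i = 0; i < count; i++) {
                writer.write(fetchRecord(i));
                writer.newLine();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path out = Files.createTempFile("items", ".csv");
        export(out, 100_000); // scale to 30M in the real job
        System.out.println(Files.size(out) > 0);
    }
}
```

The same idea applies if you're feeding Neo4j's BatchInserter directly: pull records from the source in a loop and hand each one over as you go, rather than accumulating all of them in memory first.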



--
You received this message because you are subscribed to the Google Groups "Neo4j" group.

Sukaant Chaudhary

Nov 4, 2015, 9:26:47 PM
to ne...@googlegroups.com
Thanks Michael, I've solved this issue.

-Sukaant Chaudhary
