
Handling OutOfMemory Error


ruds

Sep 16, 2008, 2:49:25 AM
Hi,
I'm operating on a 600MB file to extract some data from it.
I'm using ArrayList and HashTable for storing the values extracted
from the file and manipulating the data obtained.
When I try executing the program I'm getting the following error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Unknown Source)
at java.lang.AbstractStringBuilder.expandCapacity(Unknown
Source)
at java.lang.AbstractStringBuilder.append(Unknown Source)
at java.lang.StringBuffer.append(Unknown Source)
at
Analysis_Summary_Variable.getdata(Analysis_Summary_Variable.java:61)
at
Analysis_Summary_Variable.main(Analysis_Summary_Variable.java:549)

How should I increase the JVM's memory to handle such a large amount of
data?
Is there any other way around it?

softwarepearls_com

Sep 16, 2008, 3:21:05 AM

The -Xmx command line flag lets you set the maximum heap size (e.g.
-Xmx1024m). See the java.exe docs for details.

EJP

Sep 16, 2008, 4:32:52 AM
ruds wrote:
> I'm using ArrayList and HashTable for storing the values extracted
> from the file and manipulating the data obtained.

Don't.

Use a database.

Christian

Sep 16, 2008, 6:07:28 AM
ruds wrote:

It depends. Either use a database, or, if you really need the speed and
have the RAM, use
-Xmx1024m

Be aware, though: hash maps sometimes have a really large overhead. For
storing 100 MiB of ints in a HashMap you might need more than 2 GiB of RAM.

So if you have such large amounts of data, some flatter structure that
is less object oriented might help.
Especially as the built-in collections only handle Integers and not ints,
even the overhead of an ArrayList might be too much for you.

So it all depends on what data you have.
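The boxing overhead mentioned above can be sketched in Java. This is an illustrative comparison, not code from the original program; the size figures in the comments are rough, JVM-typical values:

```java
import java.util.ArrayList;
import java.util.List;

public class FlatStorageDemo {
    // Boxed storage: every element becomes a separate Integer object on the
    // heap (roughly 16 bytes each on a typical JVM), plus a reference in the
    // ArrayList's backing array.
    static List<Integer> boxed(int n) {
        List<Integer> xs = new ArrayList<>(n);
        for (int i = 0; i < n; i++) xs.add(i);
        return xs;
    }

    // Flat storage: one contiguous int[] at 4 bytes per element, no
    // per-element objects at all.
    static int[] flat(int n) {
        int[] xs = new int[n];
        for (int i = 0; i < n; i++) xs[i] = i;
        return xs;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        // Same data either way; the int[] version uses a fraction of the memory.
        System.out.println(boxed(n).get(n - 1) + " == " + flat(n)[n - 1]);
    }
}
```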

Christian

Tom Anderson

Sep 16, 2008, 12:24:44 PM

-Xmx, as others have mentioned.

> Is there any other way around it?

Firstly, the stack trace there is from a StringBuffer which is trying to
expand itself. If you can work out ahead of time how big that StringBuffer
eventually needs to be, or even put a useful upper bound on it, then you
can create the StringBuffer with that much capacity in the first place,
which will avoid the need to expand it, and might avoid that failure. It
should certainly improve performance.
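The presizing idea can be sketched as follows; the size estimate here is a made-up placeholder (for a buffer that holds most of the file, `File.length()` would be a usable upper bound):

```java
public class PresizedBuffer {
    // Creating the builder with its final capacity up front means append()
    // never has to copy the backing array into a larger one -- the
    // Arrays.copyOf / expandCapacity calls in the stack trace disappear.
    static StringBuilder presized(long expectedChars) {
        return new StringBuilder((int) expectedChars);
    }

    public static void main(String[] args) {
        // Suppose we estimate the result at roughly 1 MiB of characters
        // (hypothetical figure; use File.length() or a measured bound).
        StringBuilder sb = presized(1 << 20);
        System.out.println("capacity: " + sb.capacity());
    }
}
```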

I'd also look at whether you need to use that buffer at all - you're not
doing something like reading the entire file line by line and putting it
in the buffer, are you? If you are, find an alternative! In general, a
great way to reduce memory use is to find ways of doing things
incrementally, so you don't need to have all your data in memory at once.
For instance, if you were adding up all the numbers in a file, you might
do:

make an empty list of numbers
for line in file:
    parse the line to a number
    put the number in the list
set the total to 0
for number in list:
    add the number to the total
report the total

But you'd use far less memory like this:

set the total to 0
for line in file:
    parse the line to a number
    add the number to the total
report the total
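The incremental version above might look like this in Java, reading one line at a time so only the current line is ever in memory (the StringReader stands in for a FileReader over the real 600MB file):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class StreamingSum {
    // Parse and accumulate each line immediately instead of collecting
    // everything into a list first; memory use stays constant regardless
    // of how big the input is.
    static long sumLines(Reader source) throws IOException {
        long total = 0;
        BufferedReader in = new BufferedReader(source);
        String line;
        while ((line = in.readLine()) != null) {
            total += Long.parseLong(line.trim());
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(sumLines(new StringReader("1\n2\n3\n")));  // prints 6
    }
}
```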

That's an obvious and trivial example, and I imagine your problem does
not admit such easy improvements. However, with the application of a
sufficient amount of cleverness, some degree of incrementalisation is
often possible.

Otherwise, whilst ArrayList is fine, the HashMap would worry me slightly
in terms of space. How are you using it? Is the set of keys the same or
similar between records? Could you use an object instead of a hashmap?
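If the keys really are the same for every record, the object-instead-of-hashmap swap might look like this (the field names are hypothetical, standing in for whatever the file actually contains):

```java
import java.util.HashMap;
import java.util.Map;

public class RecordVsMap {
    // If every record has the same keys, a plain class holds the same data
    // with no Map.Entry objects and no per-record hash table.
    static final class Record {
        final int id;
        final double score;
        Record(int id, double score) { this.id = id; this.score = score; }
    }

    public static void main(String[] args) {
        // One HashMap per record: each key/value pair costs an entry object
        // on top of the boxed values themselves.
        Map<String, String> asMap = new HashMap<>();
        asMap.put("id", "42");
        asMap.put("score", "3.5");

        Record asObject = new Record(42, 3.5);
        System.out.println(asMap.get("id") + " == " + asObject.id);
    }
}
```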

If you tell us more about your program, we can give you more specific
help.

tom

--
Any problem in computer science can be solved with another layer of
indirection. -- David Wheeler
