If you're storing that much data and it's purely in-memory, why do you need a database?
It would be less overhead to store it in an ArrayList or HashMap.
Also, at that scale, you are going to have significant GC pauses.
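If the data only ever needs keyed lookups rather than SQL queries, a plain collection is a minimal sketch of that idea (the class, record name and fields here are invented for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class InMemoryStore {
    // A value class standing in for one row; name and fields are invented
    record Row(int id, String payload) {}

    // Keyed lookups on a HashMap replace SELECT ... WHERE id = ?
    static Map<Integer, Row> buildStore() {
        Map<Integer, Row> rows = new HashMap<>();
        rows.put(1, new Row(1, "first"));
        rows.put(2, new Row(2, "second"));
        return rows;
    }

    public static void main(String[] args) {
        System.out.println(buildStore().get(2).payload()); // prints "second"
    }
}
```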
You might want to check out off-heap storage solutions like
http://terracotta.org/products/bigmemory
and memcached.
What was your connect string? Do you try to insert all the data in one
transaction, or do you flush it from time to time? I remember it was
mentioned somewhere else that H2 itself holds some data in memory while
a transaction is running, and it might exceed the GC limit when
inserting very large datasets.
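Committing in chunks keeps that per-transaction state small during a bulk load. A hedged sketch of the pattern (assumes the H2 jar is on the classpath; the table name, column types, row count and batch size are all illustrative):

```java
import java.sql.*;

public class BulkLoad {
    // Loads rows in batches, committing periodically, then returns the count
    static int loadAndCount() throws SQLException {
        // In-memory database here; a file or /dev/shm URL works the same way
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:test")) {
            conn.setAutoCommit(false); // commit manually, in chunks
            try (Statement st = conn.createStatement()) {
                st.execute("CREATE TABLE t(id INT PRIMARY KEY, v VARCHAR)");
            }
            try (PreparedStatement ps =
                     conn.prepareStatement("INSERT INTO t VALUES (?, ?)")) {
                for (int i = 0; i < 100_000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row" + i);
                    ps.addBatch();
                    if (i % 10_000 == 0) {   // flush every 10k rows so the
                        ps.executeBatch();   // transaction log stays small
                        conn.commit();
                    }
                }
                ps.executeBatch();           // flush the final partial batch
                conn.commit();
            }
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM t")) {
                rs.next();
                return rs.getInt(1);
            }
        }
    }

    public static void main(String[] args) throws SQLException {
        System.out.println(loadAndCount());
    }
}
```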
On 09.01.2012 07:48, Karun wrote:
> Hi Christoph,
>
> Thanks for your suggestion.
>
> I tried to use /dev/shm on a Linux machine to populate more data using
> the in-memory database.
>
> But I was able to load only 1 million rows; after that I get a
> JDBCException "OutOfMemoryError: GC overhead limit exceeded".
>
> I tried these steps:
> 1) installed the db under /dev/shm
> 2) started the server from this directory in one process
> 3) loaded the data (reading from a .csv file) from the same machine in
> another process.
>
> Can you please help me out if there is anything I am missing when I
> use the /dev/shm directory? Or kindly tell me whether an in-memory db
> can hold more data (approx. 160 million rows).
>
> Thanks in advance.
>
> Regards,
> Karun
>
> On Jan 5, 5:32 pm, Karun<karun.moha...@gmail.com> wrote:
>
>> Hi
>>
>> Thanks for quick reply.
>>
>> Regards,
>> Mohanty
>>
Of course you can create your own folder for the db; just use the shm
like a regular hard disk, but keep in mind the content is lost whenever
you reboot or power off your PC.
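For example, something like the following matches the two-process setup described above (the folder name and jar name are illustrative):

```shell
# Create a folder on the tmpfs RAM disk -- contents vanish on reboot
mkdir -p /dev/shm/h2data

# Start the H2 TCP server with its database files on the RAM disk
java -cp h2.jar org.h2.tools.Server -tcp -baseDir /dev/shm/h2data

# The loader process in another shell then connects with the URL:
#   jdbc:h2:tcp://localhost/mydb
```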