I have a problem with my H2 database.
There is a table named MWI5REP, which contains 25493 rows.
When I do a SELECT *, my application throws an
exception:
org.h2.jdbc.JdbcSQLException: Allgemeiner Fehler:
"java.lang.ArrayIndexOutOfBoundsException: 512"
General error: "java.lang.ArrayIndexOutOfBoundsException: 512"; SQL
statement:
SELECT MWI5REP.* FROM MWI5REP WHERE MWI5REP.I5RCTX = 100 AND
MWI5REP.I5D0NB = 677 AND I5CQST = 'F' AND I5W5NB > 0 ORDER BY I5RCTX,
I5D0NB, I5CRNB, I5W3NB, I5W4NB, I5XXTX [50000-126]
at org.h2.message.Message.getSQLException(Message.java:110)
at org.h2.message.Message.convert(Message.java:287)
at org.h2.message.Message.convert(Message.java:248)
at org.h2.command.Command.executeQuery(Command.java:134)
at org.h2.jdbc.JdbcStatement.executeQuery(JdbcStatement.java:76)
....
Caused by: java.lang.ArrayIndexOutOfBoundsException: 512
at org.h2.store.DataPage.writeInt(DataPage.java:139)
at org.h2.store.DataPage.writeValue(DataPage.java:415)
at org.h2.result.ResultDiskBuffer.addRows(ResultDiskBuffer.java:93)
at org.h2.result.LocalResult.addRowsToDisk(LocalResult.java:285)
at org.h2.result.LocalResult.addRow(LocalResult.java:280)
at org.h2.command.dml.Select.queryFlat(Select.java:499)
at org.h2.command.dml.Select.queryWithoutCache(Select.java:558)
at org.h2.command.dml.Query.query(Query.java:243)
at org.h2.command.CommandContainer.query(CommandContainer.java:81)
at org.h2.command.Command.executeQuery(Command.java:132)
... 41 more
Is there a way to prevent this error?
A setMaxRows of 1000 rows and VM args (Xmx and Xms set to 1024)
did not help either...
I would be grateful for any help!
Regards,
Reiner Lott
(I'm sorry, my German is very bad, but I want to help you.)
The error is serious, but you need to give us more information.
Is your H2 version old?
What are your settings?
Could you provide a short test case?
-- Sam Van Oort
On Jan 5, 1:20 am, Bratmaxxe <Reiner.L...@gmx.de> wrote:
OK, let's try to solve the problem ;)
I am using H2 version 1.2.126 in embedded mode (org.h2.Driver) -
connection string: "jdbc:h2:C:\dblocation\mydb"
I have some tables in the DB, and the problem occurs in the table
named "MWI5REP", which contains 25493 rows.
If I do a simple SELECT *, my Java program throws the following
exception:
....
Is there anything I can do to prevent the problem?
I tried setMaxRows(1000) and VM args (increased Xmx and Xms to
1024), but without success!
Thanks in advance,
sincerely,
Reiner Lott
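For reference, the embedded setup described above can be sketched as follows. This is a reconstruction, not the poster's actual code: the driver class, connection string, table name, and the setMaxRows attempt are taken from the post, while the class name and surrounding structure are assumed.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch of the embedded-mode setup described above. The database
// path is the one quoted in the post; everything else is assumed.
public class SelectAllExample {
    static final String URL = "jdbc:h2:C:/dblocation/mydb";

    public static void main(String[] args) throws Exception {
        Class.forName("org.h2.Driver"); // embedded-mode driver
        try (Connection conn = DriverManager.getConnection(URL);
             Statement st = conn.createStatement()) {
            st.setMaxRows(1000); // the attempted workaround (did not help)
            try (ResultSet rs = st.executeQuery("SELECT * FROM MWI5REP")) {
                while (rs.next()) {
                    // process each row ...
                }
            }
        }
    }
}
```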
I think I found the problem.
I had previously added the following fix to my code:
// FIX --> RuntimeException: freeCount expected
// The problem occurs in Storage.refillFreeList(). This method was
// introduced in version 1.0.76 (2008-07-27) to solve the problem
// "The database file was growing after deleting many rows, and
// after large update operations".
System.setProperty("h2.check", "false");
After deleting this code snippet, everything seemed to work fine...
Greetings
Reiner
I can understand your German okay with the aid of a dictionary, and it
is good practice for me to read German. Feel free to continue in that
language if you prefer.
Where did you find the fix you quoted? I'm not getting any results in
the documentation, groups, or code. In any case, disabling the check
may prevent the error, but I think there is still a problem with the
H2 program.
Could you post a self-contained test case which causes the original
problem? Something short and simple maybe?
Cheers,
Sam
After deleting and inserting a large number of rows (deleting 300,000
and then writing 300,000 rows) using 1.0.76 (2008-07-27), I got a
"RuntimeException: freeCount expected" in my Java program.
So I searched Google for this problem and found the following thread:
http://groups.google.com/group/h2-database/browse_thread/thread/72ec33358d275b8a
// QUOTE
Hi,
> Caused by: java.lang.RuntimeException: freeCount expected 0, got: 448
> at org/h2/message/Message.getInternalError (Message.java:179)
> at org/h2/store/Storage.refillFreeList (Storage.java:247)
> at org/h2/store/Storage.allocate (Storage.java:254)
> at org/h2/store/Storage.addRecord (Storage.java:182)
> at org/h2/index/ScanIndex.add (ScanIndex.java:123)
> at org/h2/table/TableData.addRow (TableData.java:108)
> ... 15 more
> Our jdbc url is:
> jdbc:h2:rundir\db
> \db;LOG=2;MAX_LOG_SIZE=1;DB_CLOSE_DELAY=-1;TRACE_MAX_FILE_SIZE=1,username=sa,password=
Thanks! Unfortunately I couldn't reproduce this problem - do you have
a simple test case where this problem occurs?
> In another post here, it is mentioned that "It is not critical at
> runtime if freeCount is incorrect". So, is there a way to configure
> the database, so that this check is not performed (at least for the
> time being).
Yes, set the system property h2.check to false: java -Dh2.check=false
... or System.setProperty("h2.check", "false") before loading the
database driver.
> Also, is it possible for us to maybe detect this
> condition and somehow repair the database?
The database is not broken, there is no need to repair it. It is
something in the code that I would like to fix.
Regards,
Thomas
// QUOTE
So I used this workaround in my program to prevent the runtime
exception...
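The workaround quoted above boils down to setting one system property before the H2 driver class is loaded. A minimal sketch (the class and method names here are mine, not from the thread):

```java
// Sketch of the quoted workaround: the h2.check system property must
// be set before the org.h2.Driver class is loaded.
public class DisableCheckExample {

    // Disable H2's internal consistency checks (the freeCount check).
    static void disableCheck() {
        System.setProperty("h2.check", "false");
    }

    public static void main(String[] args) throws Exception {
        disableCheck();                 // must happen first
        Class.forName("org.h2.Driver"); // now load the driver
        // ... open connections and run statements as usual ...
    }
}
```

Equivalently, the property can be passed on the command line as `java -Dh2.check=false ...`, as Thomas describes in the quoted thread.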
After doing some tests, I now get a Java heap exception or a
"database already closed" error after deleting and inserting a large
number of rows.
01-05 10:36:37 jdbc[2]: SQLException
org.h2.jdbc.JdbcSQLException: Die Datenbank ist bereits geschlossen
The database has been closed [90098-126]
at org.h2.message.Message.getSQLException(Message.java:110)
at org.h2.message.Message.getSQLException(Message.java:121)
at org.h2.message.Message.getSQLException(Message.java:74)
at org.h2.message.Message.getSQLException(Message.java:156)
at org.h2.engine.Database.checkPowerOff(Database.java:468)
at org.h2.command.Command.executeUpdate(Command.java:227)
at org.h2.jdbc.JdbcStatement.executeUpdateInternal(JdbcStatement.java:124)
at org.h2.jdbc.JdbcStatement.executeUpdate(JdbcStatement.java:109)
...
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
If I use VM args like -Xms256m -Xmx1024m, this problem does not
appear.
Where is the mistake? Can I prevent this without setting the memory
for the JVM?
Cheers,
Reiner
Does anyone have an idea?
Cheers Reiner
On 5 Jan., 11:01, Bratmaxxe <Reiner.L...@gmx.de> wrote:
> Hi Sam,
>
> After deleting and inserting a large number of rows (deleting 300,000
> and then writing 300,000 rows) using 1.0.76 (2008-07-27), I got a
> "RuntimeException: freeCount expected" in my Java program.
>
> So I searched Google for this problem and found the following thread:
>
> http://groups.google.com/group/h2-database/browse_thread/thread/72ec3...
Yes, the original error happened because h2.check was set to false.
This is also something I will fix in the next release (you should be
able to set h2.check to false).
> After doing some tests, I now get a Java heap exception or a
> "database already closed" error after deleting and inserting a large
> number of rows.
>
> 01-05 10:36:37 jdbc[2]: SQLException
> org.h2.jdbc.JdbcSQLException: Die Datenbank ist bereits geschlossen
> The database has been closed [90098-126]
In version 1.2.126, the database is closed when the system runs out of
memory (to avoid database corruption). So the question is why the
JVM ran out of memory. It's hard to say without more details. Maybe I
will even need a heap dump. To get one, you could run the
application with java -XX:+HeapDumpOnOutOfMemoryError - see also
http://blogs.sun.com/alanb/entry/heap_dumps_are_back_with - and post /
upload the heap dump (or analyze it yourself). For large attachments,
use a public temporary storage service such as Rapidshare - see also
http://www.h2database.com/html/build.html#support
Regards,
Thomas
I went hunting for the source of the problem (sorry for the delay,
Reiner). Here's a test case that will trigger an OutOfMemoryError --
I believe it may be related to Issue 157. Something is definitely
buggy in the storage routines.
--Create new DB with URL: jdbc:h2:F:/Temp/DB/test
--Run the following SQL
CREATE TABLE testRows(key IDENTITY primary key, value INTEGER);
INSERT INTO testRows(value) SELECT X FROM SYSTEM_RANGE(1,10000000);
Eats 60-90% of a CPU core and writes to temp files at about 50-150 kB/s.
Writes about 530 MB of data.... then OutOfMemoryError! The JVM is
allowed about 96 MB of heap.
-- Create new DB with URL: jdbc:h2:F:/Temp/DB/test;PAGE_STORE=FALSE
--Run the following SQL
CREATE TABLE testRows(key IDENTITY primary key, value INTEGER);
INSERT INTO testRows(value) SELECT X FROM SYSTEM_RANGE(1,10000000);
Takes *forever* and a half to run. It seems to write to temp files at
several kB/s, and it eats a full CPU core while running. I aborted it
after nearly one hour. Total data written to temp files: 223 MB.
Test system is a two-year-old laptop, so while I expect some slowness,
this level of performance is clearly a bug. The system can read/write
at least 40 MB/s to the HD consistently.
I'm going to run some profiling to see why these are running so
slowly and what kinds of objects are eating all the memory. Maybe
I'll find the bug. Results soon, I hope? My money says you'll
probably have fixed the problem before I can find it, though. *grin*
Cheers,
Sam Van Oort
On Jan 6, 1:05 pm, Thomas Mueller <thomas.tom.muel...@gmail.com>
wrote:
> Hi,
>
> Yes, the original error happened because h2.check was set to false.
> This is also something I will fix in the next release (you should be
> able to set h2.check to false).
>
> > After doing some tests, I now get a Java heap exception or a
> > "database already closed" error after deleting and inserting a large
> > number of rows.
>
> > 01-05 10:36:37 jdbc[2]: SQLException
> > org.h2.jdbc.JdbcSQLException: Die Datenbank ist bereits geschlossen
> > The database has been closed [90098-126]
>
> In version 1.2.126, the database is closed when the system ran out of
> memory (to avoid database corruption). So the question is why did the
> JVM run out of memory. It's hard to say without more details. Maybe I
> will even need a heap dump. To get that, you could you run the
> application with java -XX:+HeapDumpOnOutOfMemoryError - see also
> http://blogs.sun.com/alanb/entry/heap_dumps_are_back_with - and post /
UndoLogRecord instances accounted for 82.4% of the memory used before I
ran out, with org.h2.result.Row instances making up another 8% and the
remainder being various things (Value[], etc.).
I suspect there are other things that will limit the size of an
update, but requiring 24 bytes for every row is a definite hard limit
-- and one I hope should be simple to fix, since there's no reason all
those objects need to be in memory at once.
***The (partial) stack trace for allocation is:***
org.h2.command.Command.executeUpdate()
org.h2.command.CommandContainer.update()
org.h2.command.dml.Insert.update()
org.h2.command.dml.Insert.insertRows()
org.h2.engine.Session.log(org.h2.table.Table, short, org.h2.result.Row)
Examining the heap confirms the profiler result. I can provide the
results, but it's easier to just create a six-line Java program
executing the commands in my previous post and run it with a memory
profiler.
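Such a driver program might look roughly like this (a sketch only; the SQL and the database URL are taken from the test case earlier in the thread, the rest is assumed):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Rough reconstruction of the short driver program mentioned above,
// executing the SQL from the earlier test-case post.
public class TestCaseRunner {
    static final String URL = "jdbc:h2:F:/Temp/DB/test";

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(URL);
             Statement st = conn.createStatement()) {
            st.execute("CREATE TABLE testRows(key IDENTITY PRIMARY KEY, value INTEGER)");
            // One huge insert in a single statement/transaction:
            st.execute("INSERT INTO testRows(value) SELECT X FROM SYSTEM_RANGE(1,10000000)");
        }
    }
}
```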
> I found (one) cause of OutOfMemoryError with large updates using the
> new page store. UndoLogRecord instances pile up in memory,
See also the roadmap "Support large inserts and updates (use the
transaction log for rollback)".
Regards,
Thomas
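Until that roadmap item is implemented, a general application-level workaround (not one proposed in the thread itself) is to split one huge insert into batches and commit periodically, so that undo log records for already-committed rows can be released instead of accumulating for millions of uncommitted rows. A sketch, with an assumed table layout matching the test case above:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;

// Hypothetical workaround sketch: insert in committed batches so undo
// log records are not held in memory for every row at once.
public class BatchedInsertExample {
    static final int BATCH_SIZE = 10_000;

    // Number of commits needed to cover totalRows in BATCH_SIZE chunks.
    static int batchCount(long totalRows) {
        return (int) ((totalRows + BATCH_SIZE - 1) / BATCH_SIZE);
    }

    static void insertAll(Connection conn, long totalRows) throws Exception {
        conn.setAutoCommit(false);
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO testRows(value) VALUES (?)")) {
            for (long i = 1; i <= totalRows; i++) {
                ps.setLong(1, i);
                ps.addBatch();
                if (i % BATCH_SIZE == 0) {
                    ps.executeBatch();
                    conn.commit(); // undo records for this batch can be freed
                }
            }
            ps.executeBatch(); // flush any remainder
            conn.commit();
        }
    }
}
```

The trade-off is that a failure mid-way leaves earlier batches committed, so this only fits workloads where the insert does not need to be atomic as a whole.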