IndexHits<Node> rhits = index.query("cs", "*"); // wildcard query against the legacy index
out.println("#" + rhits.size());
rhits.close();
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
	at org.neo4j.collection.primitive.hopscotch.IntArrayBasedKeyTable.initializeTable(IntArrayBasedKeyTable.java:54)
	at org.neo4j.collection.primitive.hopscotch.IntArrayBasedKeyTable.<init>(IntArrayBasedKeyTable.java:48)
	at org.neo4j.collection.primitive.hopscotch.LongKeyTable.<init>(LongKeyTable.java:27)
	at org.neo4j.collection.primitive.Primitive.longSet(Primitive.java:66)
	at org.neo4j.kernel.impl.coreapi.LegacyIndexProxy$1.<init>(LegacyIndexProxy.java:296)
	at org.neo4j.kernel.impl.coreapi.LegacyIndexProxy.wrapIndexHits(LegacyIndexProxy.java:294)
	at org.neo4j.kernel.impl.coreapi.LegacyIndexProxy.query(LegacyIndexProxy.java:352)
I am sorry to report that those Lucene queries actually behave inconsistently: they are usually slower than before, and in most cases I get that error again (using a 6 GB heap on a machine with 8 GB of RAM in total).
I am updating the graph with deletes and updates of nodes and relationships. I reduced the number of operations per transaction, but most of the time I still get this error.
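To show what I mean by reducing the operations per transaction: I split the pending operations into fixed-size chunks and commit each chunk in its own transaction. This is only a minimal, Neo4j-free sketch of the chunking idea (the chunk() helper and the sizes are illustrative, not my actual code); in the real code, each chunk would be applied inside one db.beginTx() / tx.success() block.

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedUpdates {
    // Split a list of pending operations into fixed-size chunks so that
    // each chunk can be committed in its own transaction; smaller
    // transactions keep less uncommitted state in the heap at once.
    static <T> List<List<T>> chunk(List<T> ops, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < ops.size(); i += chunkSize) {
            chunks.add(new ArrayList<>(ops.subList(i, Math.min(i + chunkSize, ops.size()))));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> ops = new ArrayList<>();
        for (int i = 0; i < 10; i++) ops.add(i);

        // 10 operations in chunks of 4 -> 3 "transactions": 4 + 4 + 2.
        List<List<Integer>> chunks = chunk(ops, 4);
        System.out.println(chunks.size());        // prints 3
        System.out.println(chunks.get(2).size()); // prints 2
    }
}
```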
Insertion with the Batch Inserter, on the other hand, seems to work fine!
What would you suggest? Should I stay on version 1.9.9, or could upgrading to 2.2.1 be a solution? Does the new version change the packages involved in these problems?
Thanks in advance
Rita