--
You received this message because you are subscribed to the C&P "Stardog" group.
To post to this group, send email to sta...@clarkparsia.com
To unsubscribe from this group, send email to
stardog+u...@clarkparsia.com
For more options, visit this group at
http://groups.google.com/a/clarkparsia.com/group/stardog?hl=en
This is how I'm creating the database.

Create the DB:
stardog-admin db create -n myDb

Add the graph:
stardog data add --named-graph http://localhost:5822/fub_byrn myDb file.ttl

When file.ttl is small (90 MB) it is accepted, but when file.ttl is large (around 1 GB) it gives a Java heap error and crashes myDb. What is wrong with these commands? Can someone correct them?
On Wednesday, July 24, 2013 7:57:29 PM UTC+2, Asher Baig wrote:

Hi,

When I try to load a big RDF file of 5 GB, it starts giving me this error:

[WARNING org.jboss.netty.channel.socket.nio.AbstractNioWorker.null - Jul 24, 2013 07:50:09.526] Unexpected exception in the selector loop.
java.lang.OutOfMemoryError: GC overhead limit exceeded
[WARNING org.jboss.netty.channel.socket.nio.AbstractNioWorker.null - Jul 24, 2013 07:50:18.520] Unexpected exception in the selector loop.
java.lang.OutOfMemoryError: Java heap space

Does Stardog not support big files?
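[Editor's note: a `java.lang.OutOfMemoryError` like the above usually means the JVM heap is too small for the operation. As a sketch of one common workaround, assuming the Stardog launch scripts honor the `STARDOG_JAVA_ARGS` environment variable (as documented for Stardog of this era), you could raise the heap before starting the server:]

```shell
# Give the Stardog JVM a larger heap before starting the server.
# The 4g value is illustrative; size it to your machine and data.
export STARDOG_JAVA_ARGS="-Xms4g -Xmx4g"
```

[Note that a bigger heap only raises the ceiling; the thread below explains why bulk data should be loaded at database creation instead of via `data add`.]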
--
On Sun, Jul 28, 2013 at 1:04 PM, Asher Baig <ashe...@gmail.com> wrote:
> This is how I'm creating the database.
>
> Create the DB:
> stardog-admin db create -n myDb
>
> Add the graph:
> stardog data add --named-graph http://localhost:5822/fub_byrn myDb file.ttl
>
> When file.ttl is small (90 MB) it is accepted, but when file.ttl is large (around 1 GB) it gives a Java heap error and crashes myDb. What is wrong with these commands? Can someone correct them?

I guess my previous email, where I explained exactly what the issue with what you were attempting was, was unclear, so I'll try again.

Adding *very* large amounts of data post-creation keeps the transaction in memory; thus, adding tens of millions of triples, or more, via `data add` *after* you have created a database has a large memory requirement and is generally not suitable for bulk-adding data. The good news is that Stardog is perfectly capable of bulk loading billions of triples when you create the database. So if you're going to be adding ~10M triples (a rough estimate of what 1 GB of data would be), as I've already explained, you should add the data when you *create* the database so the bulk loader can be used.
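[Editor's note: the advice above amounts to passing the data file on the `db create` command line so the bulk loader handles it. A minimal sketch, reusing the `myDb` and `file.ttl` names from the thread:]

```shell
# Bulk load at creation time instead of `stardog data add` afterwards:
# files listed after the database name are loaded by the bulk loader.
stardog-admin db create -n myDb file.ttl
```

[Whether a named graph can be targeted during bulk load depends on the Stardog version in use; check the `db create` help output for your installation.]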