Hi,
I am trying to bulk load an extremely large file (10 million triples) into Stardog. The file had some errors, so after loading it I deleted the database, made changes to the file, and then loaded it again. I repeated this process 7-8 times. But now, after loading only about 200,000 triples, I get the error: java.lang.OutOfMemoryError: Direct buffer memory.
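For context, each iteration looked roughly like this (database name and file path are the same as in the log below; the drop command is the standard way I removed the database each time):

./stardog-admin db drop sample
# fix errors in /home/usr/sample.nt outside Stardog
./stardog-admin db create -n sample /home/usr/sample.nt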
I realize this was not the ideal way to use the software. I have now fixed the file completely outside the software. Is there anything I can do to be able to load the file into Stardog again?
Here is the output I got:
./stardog-admin db create -n sample /home/usr/sample.nt
Bulk loading data to new database.
Parsing triples: 100% complete in 00:00:02 (270K triples - 100.0K triples/sec)
Parsing triples finished in 00:00:02.701
Creating index: 100% complete in 00:00:00 (1478.0K triples/sec)
Creating index finished in 00:00:00.180
Computing statistics: 100% complete in 00:00:00 (2268.6K triples/sec)
Computing statistics finished in 00:00:00.117
Loading complete.
Inserted 265,426 unique triples from 266,040 read triples in 00:00:03.941 at 67.5K triples/sec
Bulk load complete. Loaded 265,426 triples from 1 file(s) in 00:00:03 @ 67.5K triples/sec.
Errors were encountered during loading:
File: /home/usr/sample.nt Message: java.lang.OutOfMemoryError: Direct buffer memory
Successfully created database 'sample'.
Any help will be greatly appreciated!