Safest Data Size Limit for Bulk Import in ArangoDB

shivaraj naidu

Aug 30, 2016, 4:01:47 AM
to ArangoDB
Recently I came across a situation where I needed to bulk import data from a 10 MiB CSV file (using arangoJS in a Node.js project).

It works fine with bulk import. But my colleague said not to import that much data at once,

because that will affect the process/performance negatively.

He said to import the records 100 at a time.

So I want to know: is there a safe size limit for bulk import, or can I import as much data as I want in one go?

And if 100 at a time is the safest way, is there an easy way to do this on an array of docs with AQL?

Jan

Aug 30, 2016, 4:19:39 AM
to ArangoDB
Hi,

In 3.0 the arangod server has a request body size limit of 512 MB, so any single request sent to ArangoDB must be at most 512 MB. That is the hard upper limit.
In practice it is much better to send batches of 100 or 1,000 documents at a time until you reach the end of the input data.
This way it is very unlikely you will ever hit the request body size limit.
Whether it's 100, 1,000, or some other number of documents per batch is up to you: the smaller the documents, the more of them you can fit into a batch.
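For illustration, a batching loop from Node.js could look roughly like the following. This is a minimal sketch, assuming arangojs with a local server, a collection named "collection", and an arbitrary batch size of 1,000:

    // Minimal batching sketch. Assumptions: arangojs client, a local
    // ArangoDB server, a collection named "collection", batch size 1000.
    import { Database } from "arangojs";

    const db = new Database({ url: "http://localhost:8529" });
    const collection = db.collection("collection");

    async function importInBatches(docs: object[], batchSize = 1000): Promise<void> {
      for (let i = 0; i < docs.length; i += batchSize) {
        // Send one slice at a time so no single request grows too large.
        const batch = docs.slice(i, i + batchSize);
        await collection.import(batch);
      }
    }

Each call to collection.import() then maps to one bulk import request, so every request stays far below the 512 MB limit.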

To import multiple documents with AQL, you can do the following:

    FOR doc IN @docs
      INSERT doc INTO collection

and pass the documents as an array in the bind parameter @docs, e.g.

    [ { "value1": "test", "value2": "something" }, { "foo": "bar", "baz": "bat" }, ... ]

Best regards
J