Hi,
I want to measure the insertion time for a block, where each block consists of 1,000 nodes and their corresponding edges. To be clearer: my dataset is an edge list, and whenever I encounter a new node I add it to the database. Since every line is an edge, I insert one edge per line. Once 1,000 nodes have been added, I print the elapsed time. I commit every time a node or edge is added, in order to simulate single insertions.

My problem is that memory consumption is too high: it needs more than 3 GB of RAM for a graph of approximately 403,394 nodes and 3,387,388 edges. I think the problem comes from the index that is kept in memory. Is there a better way?
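In case it helps, here is a minimal sketch of the loading loop I described. It uses Python's sqlite3 purely as a stand-in for the actual graph database, and the file name, schema, and edge-list format ("src dst" per line) are illustrative assumptions, not my real setup:

import sqlite3
import time

BLOCK_SIZE = 1_000  # print timing after this many new nodes

# sqlite3 is only a stand-in; the real code talks to a graph database.
conn = sqlite3.connect("graph.db")
conn.execute("CREATE TABLE IF NOT EXISTS nodes (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE IF NOT EXISTS edges (src INTEGER, dst INTEGER)")

seen = set()          # in-memory index of nodes already inserted
new_nodes = 0
block_start = time.perf_counter()

with open("edges.txt") as f:          # one edge per line: "src dst"
    for line in f:
        src, dst = map(int, line.split())
        for node in (src, dst):
            if node not in seen:      # new node: insert it immediately
                conn.execute("INSERT INTO nodes (id) VALUES (?)", (node,))
                conn.commit()         # commit per insertion, as described
                seen.add(node)
                new_nodes += 1
                if new_nodes % BLOCK_SIZE == 0:
                    elapsed = time.perf_counter() - block_start
                    print(f"block of {BLOCK_SIZE} nodes: {elapsed:.2f}s")
                    block_start = time.perf_counter()
        conn.execute("INSERT INTO edges (src, dst) VALUES (?, ?)", (src, dst))
        conn.commit()                 # commit the edge insertion too

conn.close()

In this sketch the seen set plays the role of the in-memory index I suspect: it grows with the number of nodes and is never flushed to disk.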
Thanks in advance.