mongoimport -d fastqdb -c seqcol --file /Users/loginname/FASTQ.json --jsonArray
connected to: 127.0.0.1
2014-07-29T18:06:54.311+0200 Progress: 31598/432568845 0%
2014-07-29T18:06:54.311+0200 200 66/second
2014-07-29T18:06:58.207+0200 Progress: 78998/432568845 0%
2014-07-29T18:06:58.208+0200 500 71/second
...
2014-07-30T13:42:32.027+0200 Progress: 138486966/432568845 32%
2014-07-30T13:42:32.027+0200 860200 12/second
2014-07-30T13:42:36.004+0200 Progress: 138534798/432568845 32%
2014-07-30T13:42:36.004+0200 860500 12/second
Details of my PC:
Memory: 4 GB 1067 MHz DDR3
Processor Name: Intel Core 2 Duo
Processor Speed: 2.66 GHz
This is my first interaction with MongoDB.
My question is: what are the exact steps to follow, and why is it taking so long to import the data into MongoDB?
If you have indexes on the collection, remove them before loading the data. Maintaining a unique index while importing data of that size can consume a lot of memory: 1 TB of data against 4 GB of RAM means that if the import has to touch any significant number of index pages to insert documents, it will exhaust memory very quickly.
Once the data is imported, add the indexes back.
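A minimal mongo shell sketch of that workflow (the index key sequence_id is hypothetical; substitute whatever fields your collection actually indexes):

    // inspect the indexes currently defined on the collection
    db.seqcol.getIndexes()

    // drop all secondary indexes before importing (the _id index is kept)
    db.seqcol.dropIndexes()

    // ... run mongoimport from the OS shell as before ...

    // rebuild the index once the import has finished
    db.seqcol.createIndex({ sequence_id: 1 }, { unique: true })

Building the index once over the finished data set is a single sequential pass, which is far cheaper than updating it document by document during the import.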