Using cursors to read a large dataset


Adam Hammond

Nov 16, 2016, 5:43:06 AM
to node-ibm_db
I am trying to migrate a large dataset (the SQL query returns around 700,000 rows) to a Cloudant database on Bluemix. The challenge is that the dataset is too large to load into memory, so the process fails.

Is there a way of defining a cursor for the query so I can process the rows one at a time without loading the whole set into memory?

Effectively I want to do something like:

Define cursor for query
Open cursor
While still rows to process
   Read row from DB2
   Write to Cloudant database
Close cursor

I know that you can do this in Java, but I don't see a way of doing it in Node.js.

bimaljha

Nov 16, 2016, 11:09:42 AM
to node-ibm_db
Currently node-ibm_db has no support for cursors as you described. Thanks.

bimaljha

Nov 18, 2016, 8:45:43 AM
to node-ibm_db
Use the test.js file below. It returns a single row at a time that the application can process in a loop. It should serve your purpose. Thanks.

var ibmdb = require("ibm_db")
  , conn = new ibmdb.Database()
  , cn = "database=sample;hostname=hotel.torolab.ibm.com;port=50000;uid=newton;pwd=xxxxx"
  ;

conn.openSync(cn);

// Create and populate a small test table.
conn.querySync("create table mytab (c1 int, c2 varchar(20))");
conn.querySync("insert into mytab values (1, 'bimal'),(2, 'kamal'),(3,'mohan'),(4,'ram')");

// queryResult hands back a result-set handle instead of materializing all rows.
conn.queryResult("select * from mytab", function (err, result) {
  if (err) {
    console.log(err);
    return;
  }

  var data;
  // fetchSync returns one row per call and a falsy value when no rows remain,
  // so only the current row is held in memory.
  while ((data = result.fetchSync())) {
    console.log(data);
  }

  conn.querySync("drop table mytab");
  conn.closeSync();
});
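
For the original use case of copying rows to Cloudant, the console.log call in the loop above can be replaced with a per-row write. A minimal sketch under that assumption, where writeRowToCloudant and source_table are hypothetical, application-supplied names (not part of ibm_db), and conn/cn are the same connection and connection string as in test.js:

// Sketch only: writeRowToCloudant is a hypothetical helper supplied by the
// application (e.g. wrapping a Cloudant client); source_table is a placeholder.
conn.queryResult("select * from source_table", function (err, result) {
  if (err) {
    console.log(err);
    return;
  }
  var row;
  while ((row = result.fetchSync())) {  // one row in memory per iteration
    writeRowToCloudant(row);            // hypothetical per-row write to Cloudant
  }
  conn.closeSync();
});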
