Node mysqljs. How to properly close connections

Ken Ingram

Dec 9, 2019, 10:28:06 PM
to node-mysql

I'm using mysqljs to access MySQL from JavaScript.

I should point out that this process works fine for a single piece of data. However, I am feeding my code a large set of data, to be processed line by line as a batch.

I create my connection like this:

var connection = mysql.createConnection({
    //debug: ['ComQueryPacket'],
    host     : dataSource.host,
    user     : dataSource.user,
    password : dataSource.password,
    database : dataSource.database
});

I have three functions that make database queries.

The function containing the SELECT query is built like this:

dbSearch(data){
  var sql = "SELECT * FROM table WHERE field = ? AND otherfield = ?";
  return new Promise((resolve, reject) => {
    connection.query(sql, [data[0], data[1]], (error, results, fields) => {
      if (error){
        console.log(error);
        reject("Database connection error: " + error);
      } else {
        resolve(results);
      }
    });
  });
}

The code executes in another function:

if (dataItem){
  dbSearch(dataItem)
    .then((row) => {
      processingfunction(row);
    });
}

If I leave out connection.end(), the code hangs and the stream of data is held up at the first item being processed.

If I put connection.end() inside the function, I get this error:

Database connection error: Error: Cannot enqueue Query after invoking quit.

If I put connection.end() as the last line of the script, everything works fine.
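
As far as I can tell from the docs, connection.end() enqueues a QUIT packet behind any queries already queued, so any query enqueued after it fails with exactly this error. A minimal sketch of an ordering that should work, assuming a single shared connection and one promise chain driving all the queries:

dbSearch(dataItem)
  .then((rows) => processingfunction(rows))
  .then(() => {
    // Safe now: no further queries will be enqueued on this connection.
    connection.end();
  })
  .catch((err) => {
    console.error(err);
    connection.destroy(); // tear down immediately, without a clean QUIT
  });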

The problem, though, is with the update and insert functions:

updateRecord(data){
  var sql = "UPDATE table SET field = ? WHERE id = ?";
  return new Promise((resolve, reject) => {
    connection.query(sql, [data[0], data[1]], (error, results, fields) => {
      if (error){
        console.log(error);
        reject("Database connection error: " + error);
      } else {
        resolve(results);
      }
    });
  });
}

inputRecord(data){
  var sql = "INSERT INTO table (field1, field2, field3) VALUES(?,?,?)";
  return new Promise((resolve, reject) => {
    connection.query(sql, [data[0], data[1], data[2]], (error, results, fields) => {
      if (error){
        console.log(error);
        reject("Database connection error: " + error);
      } else {
        resolve(results);
      } 
    });
  });
}

With connection.end() inside the function, I get this error:

Database connection error: Error: Cannot enqueue Query after invoking quit.
(node:40700) UnhandledPromiseRejectionWarning: Database connection error: Error: Cannot enqueue Query after invoking quit.
(node:40700) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:40700) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

Based on the documentation, I have no clarity on how to properly close the connection so that the code can run to completion.

Not sure what I am doing wrong. I could use some mentoring from someone experienced in using connections to process chunks of data: how do I properly handle closing them?

NOTE: A similar problem happens when I try connection pooling, so that was not a workable solution.
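
For reference, a pooling sketch that mirrors the docs, assuming the same dataSource config (the WHERE value here is just a placeholder): pool.query() checks a connection out and releases it automatically, so only pool.end() has to wait until everything is done.

var pool = mysql.createPool({
    host     : dataSource.host,
    user     : dataSource.user,
    password : dataSource.password,
    database : dataSource.database,
    connectionLimit : 10
});

// pool.query() acquires a connection, runs the query, and releases
// the connection back to the pool automatically.
pool.query("SELECT * FROM table WHERE field = ?", [someValue], (error, results) => {
    if (error) throw error;
    // process results...
});

// Gracefully close all pooled connections: queries already in progress
// complete, but anything enqueued afterwards fails with POOL_CLOSED.
pool.end((err) => {
    if (err) console.error(err);
});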

Ryan Lee

Dec 10, 2019, 12:10:59 AM
to node-...@googlegroups.com
  1. How big is the data you're trying to load? If you try to load too large a set of data, the Node.js interpreter will choke. You'll need to use the streaming API for larger data sets. To test this theory, throw a LIMIT on your select queries and see if that's the issue.
  2. How are you driving your queries? Remember MySQL is synchronous and JS is async. If your second query relies on the first query finishing, you can hang. You need to "promise chain" them. I totally recommend bluebird for this; you don't have to deal with all the wrapping yourself. Check it out:
connection
  .queryAsync("SELECT * FROM blah") //this function is exposed as part of bluebird's promisification
  .each((row) => { //bluebird's .each lets you loop through your rows sequentially
    return connection.queryAsync("INSERT INTO fubar SET ?", formattedRow); //so you can operate on each row in order
  })
  .finally(() => {
    connection.end();
  });
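
To be clear, queryAsync isn't part of mysqljs itself; it appears once bluebird promisifies the connection. A minimal setup sketch, following bluebird's promisification docs:

var Promise = require("bluebird");
var mysql = require("mysql");

// Promisify the prototypes so every connection and pool instance
// gains *Async variants of the callback methods (queryAsync, endAsync, ...).
Promise.promisifyAll(require("mysql/lib/Connection").prototype);
Promise.promisifyAll(require("mysql/lib/Pool").prototype);

var connection = mysql.createConnection({ /* same dataSource config as above */ });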

And if you're feeling really dirty, you can always process.exit().

dbSearch(dataItem)
  .then((row) => { //remember this might be an array of rows
    return processingfunction(row); //return a promise here
  })
  .then(() => {
    connection.end(); //close the connection after you've run all your queries
  });
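
One more thing: the UnhandledPromiseRejectionWarning in your output means a rejection never got handled. Ending the chain with a .catch() silences the warning and still lets the connection close either way. A sketch:

dbSearch(dataItem)
  .then((row) => processingfunction(row))
  .catch((err) => {
    // Handle the rejection so Node doesn't warn about it.
    console.error(err);
  })
  .then(() => {
    // Runs after success or failure, so the connection always closes.
    connection.end();
  });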

--Ryan

On Mon, Dec 9, 2019 at 7:28 PM Ken Ingram <ken.i...@gmail.com> wrote:

I'm using mysqljs to access MySQL from JavaScript.

I should point out that this process works fine for a single piece of data. However, I am feeding my code a large set of data, to be processed line by line as a batch.

How big? If you're talking about a lot of data, you need to stream, as you'll run out of memory in your JS thread.
How are you streaming the data?

Ken Ingram

Dec 10, 2019, 3:22:25 AM
to node-...@googlegroups.com
On Mon, Dec 9, 2019 at 9:11 PM Ryan Lee <thesand...@gmail.com> wrote:
  1. How big is the data you're trying to load? If you try to load too large a set of data, the Node.js interpreter will choke. You'll need to use the streaming API for larger data sets. To test this theory, throw a LIMIT on your select queries and see if that's the issue.
The input data is 400 lines of information, each needing processing.
At most 50 lines of data need update or insert.
 
  2. How are you driving your queries? Remember MySQL is synchronous and JS is async. If your second query relies on the first query finishing, you can hang. You need to "promise chain" them. I totally recommend bluebird for this; you don't have to deal with all the wrapping yourself. Check it out:
connection
  .queryAsync("SELECT * FROM blah") //this function is exposed as part of bluebird's promisification
  .each((row) => { //bluebird's .each lets you loop through your rows sequentially
    return connection.queryAsync("INSERT INTO fubar SET ?", formattedRow); //so you can operate on each row in order
  })
  .finally(() => {
    connection.end();
  });

Oh. This is interesting. Perhaps you have hit on the flaw in my code. Perhaps I need to run .finally()?

 
And if you're feeling really dirty, you can always process.exit().

dbSearch(dataItem)
  .then((row) => { //remember this might be an array of rows
    return processingfunction(row); //return a promise here
  })
  .then(() => {
    connection.end(); //close the connection after you've run all your queries
  });
--Ryan

Ha! "Dirty Ken" may be my nickname. Or Ol' Dirty Kenstard.

I'm going to see if .finally() on the promise chain helps.

Ken Ingram

Dec 10, 2019, 3:32:08 AM
to node-...@googlegroups.com
On Tue, Dec 10, 2019 at 12:21 AM Ken Ingram <ken.i...@gmail.com> wrote:


On Mon, Dec 9, 2019 at 9:11 PM Ryan Lee <thesand...@gmail.com> wrote:
  1. How big is the data you're trying to load? If you try to load too large a set of data, the Node.js interpreter will choke. You'll need to use the streaming API for larger data sets. To test this theory, throw a LIMIT on your select queries and see if that's the issue.
The input data is 400 lines of information, each needing processing.
At most 50 lines of data need update or insert.
 
  2. How are you driving your queries? Remember MySQL is synchronous and JS is async. If your second query relies on the first query finishing, you can hang. You need to "promise chain" them. I totally recommend bluebird for this; you don't have to deal with all the wrapping yourself. Check it out:
The queries are independent. Inserts and Updates exist in their own universe.

 
And if you're feeling really dirty, you can always process.exit().

dbSearch(dataItem)
  .then((row) => { //remember this might be an array of rows
    return processingfunction(row); //return a promise here
  })
  .then(() => {
    connection.end(); //close the connection after you've run all your queries
  });
--Ryan

Sadly, any effort to manually close the connection results in:

Processing Completed ++++++++++++++++++++++++++++++++++>
  events.js:174
      throw er; // Unhandled 'error' event
      ^

Error: Cannot enqueue Quit after invoking quit.
    at Protocol._validateEnqueue (project\node\node_modules\mysql\lib\protocol\Protocol.js:215:16)
    at Protocol._enqueue (project\node\node_modules\mysql\lib\protocol\Protocol.js:138:13)
    at Protocol.quit (project\node\node_modules\mysql\lib\protocol\Protocol.js:91:23)
    at Connection.end (project\node\node_modules\mysql\lib\Connection.js:242:18)
    at checkRecord.then.finally (project\node\utilities.js:302:32)

Ken Ingram

Dec 12, 2019, 9:16:06 PM
to node-mysql
What does it take to actually have a conversation that leads to me resolving my problem?

The documentation is not providing any insight as to why I cannot close my connection without getting this error:

UPDATE thedata SET field = 'information' WHERE objid = 1234
{ Error: Pool is closed.
    at Pool.getConnection (project\node\node_modules\mysql\lib\Pool.js:25:15)
    at Promise (project\node\utilities.js:506:8)
    at new Promise (<anonymous>)
    at executeQuery (project\node\utilities.js:505:9)
    at updateRecord (project\node\utilities.js:479:9)
    at processRecord (project\node\utilities.js:443:12)
    at checkRecord.then (project\node\utilities.js:301:12)
    at process._tickCallback (internal/process/next_tick.js:68:7) code: 'POOL_CLOSED' }
pool released.
pool released.

Following your advice, I adjusted my code:

checkRecord(data)
  .then((row) => {
    //console.log(JSON.stringify(row));
    return processRecord(row[0], row[1], input);
    // for (var i = 0; i < database_changes.length; i++){
    //   console.log(database_changes.join("','"));
    // }
  })
  .then(() => {
    pool.end((err) => {
      if (err) throw err;
      console.log("pool released.");
    });
  });

Ken Ingram

Dec 12, 2019, 11:36:28 PM
to node-mysql
I think I see the problem.
I'm still not very sophisticated with async programming, and in particular the idiosyncrasies of Promises.

The code works fine with connection.release() and processes the DML properly,
but I have to manually kill the script because the pool is still open.

I need to Promise.all() the overarching script, because each call to the script
(it's looping through an array of input data) itself returns a promise.
If I end the connection pool any time before the last DML call to the database
completes, the remaining promises will reject due to no connection.

I just don't know how to organize this as a Promise.all() resolve.

Ken Ingram

Dec 13, 2019, 4:16:11 AM
to node-mysql
Figured it out. Had nothing to do with node-mysql.
It was purely how I was returning and handling the promises.

The final story is that, in the case of my overarching function, I needed to use Promise.all to collect all of the promises and, when they were all fulfilled (or one was rejected), handle pool.end().
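
For anyone who finds this thread later, a minimal sketch of that shape, assuming an inputLines array and the checkRecord/processRecord helpers from earlier in the thread:

// One promise chain per input line.
var work = inputLines.map((line) => {
  return checkRecord(line)
    .then((row) => processRecord(row[0], row[1], line));
});

// Close the pool only after every chain has settled. Promise.all
// rejects on the first failure, so the .catch keeps one bad row from
// leaving the pool open forever.
Promise.all(work)
  .catch((err) => console.error(err))
  .then(() => {
    pool.end((err) => {
      if (err) console.error(err);
      console.log("pool released.");
    });
  });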