Closing MongoDB connections and cursors


siddu

Nov 4, 2011, 3:20:50 AM
to mongodb-user
Dear All,

I am facing some problems with MongoDB.


1. I am using MongoDB to generate autosuggestions, so every keystroke
opens a MongoDB connection and I close it after getting the result.
This seems time-consuming because of all the connection opening and
closing. What is the best way to do this? I have heard about
persistent connections.

What exactly is a persistent connection? In my case, how many
persistent connections should I open, and how do I do that?



2. Sometimes MongoDB's memory usage climbs to 100-400% and the server
hangs. After searching the net, I found that this can happen when
cursors are not closed. So, how do I close a cursor after getting the
result?

Please check the MongoDB serverStatus output below:

"mem" : {
"bits" : 64,
"resident" : 1500,
"virtual" : 8890,
"supported" : true,
"mapped" : 4047,
"mappedWithJournal" : 8094
},
"connections" : {
"current" : 58,
"available" : 9542
},
"extra_info" : {
"note" : "fields vary by platform",
"heap_usage_bytes" : 1026240,
"page_faults" : 0
},
"indexCounters" : {
"btree" : {
"accesses" : 1,
"hits" : 1,
"misses" : 0,
"resets" : 0,
"missRatio" : 0
}
},
"backgroundFlushing" : {
"flushes" : 38,
"total_ms" : 10,
"average_ms" : 0.2631578947368421,
"last_ms" : 0,
"last_finished" : ISODate("2011-11-04T07:32:17Z")
},
"cursors" : {
"totalOpen" : 170,
"clientCursors_size" : 170,
"timedOut" : 549
},
"network" : {
"bytesIn" : 1781349,
"bytesOut" : 5348882,
"numRequests" : 16209
},
"opcounters" : {
"insert" : 2,
"query" : 4004,
"update" : 0,
"delete" : 0,
"getmore" : 0,
"command" : 12254
},


Please help me in this regard.

Karl Seguin

Nov 4, 2011, 3:41:15 AM
to mongod...@googlegroups.com
What driver are you using? Connection pooling and cursor management should be taken care of for you by the driver.

Karl

siddu

Nov 4, 2011, 3:46:24 AM
to mongodb-user
@Karl..

I am using PHP driver version 1.2.4

Karl Seguin

Nov 4, 2011, 10:15:21 AM
to mongod...@googlegroups.com
As of 1.2, the PHP driver does pooling, so your first issue shouldn't be a problem. If you go to http://php.net/manual/en/mongo.connecting.php and search for "Connection Pooling", it should provide some insight, though there's nothing special you need to do.

As for potentially leaking cursors... again, the driver should take care of closing them. Are you possibly doing a find() and NOT iterating through the entire cursor? That could be a problem.
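Roughly what I mean, as a sketch (this assumes the legacy 1.2.x pecl driver; the database, collection and field names are just placeholders, not your actual schema):

<?php
// Sketch: pooling is automatic with driver 1.2+, so a plain "new Mongo()"
// reuses an already-open connection; the cursor is iterated to the end so
// the driver can clean it up.
try {
    $m          = new Mongo();
    $collection = $m->selectDB('test')->selectCollection('suggestions');

    $cursor = $collection->find(array('nm' => new MongoRegex('/^foo/')))
                         ->limit(5);

    foreach ($cursor as $doc) {
        // ... build the suggestion list ...
    }
} catch (MongoConnectionException $e) {
    echo 'Connection error: ' . $e->getMessage();
}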

If you are fully "consuming" your cursors, you might want to open a JIRA against the PHP driver:

Make sure to include version (of the driver, of php, 32/64 bits), general environment (apache?) and a sample use-case that shows how you are issuing commands. It would help if you could isolate what exact code is resulting in a cursor timing out on the server (which might be a bit of a pain in the ass since it takes 10 minutes for them to time out server side and I'm not aware of a configuration variable to shorten it).

I wonder if db.currentOp() would show idle cursors... that might make your life really easy if it does.
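If it doesn't, one crude alternative is to poll the cursor counters from serverStatus (the same command that produced the output you pasted) while the autosuggest code runs. A rough sketch, again assuming the 1.2.x driver:

<?php
// Sketch: read the server-side cursor counters; if "totalOpen" keeps
// climbing while the autosuggest code runs, something is holding
// cursors open.
$m      = new Mongo();
$status = $m->selectDB('admin')->command(array('serverStatus' => 1));

echo 'open cursors : ' . $status['cursors']['totalOpen'] . "\n";
echo 'timed out    : ' . $status['cursors']['timedOut']  . "\n";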

Karl

siddu

Nov 7, 2011, 1:55:58 AM
to mongodb-user
@Karl..

I tried connection pooling. I tried to open 100 connections using a
for loop, but it does not open 100 connections; it opens only one.

// Are you possibly doing a find() and NOT iterating through the
// entire cursor? That could be a problem.

I am fetching 5 records every time and iterating up to that limit,
but the cursors are not closing.
Please check my code below and help me.

try {

    include_once('con.php');

    $regex  = new MongoRegex("/^" . $q . "/");
    $limit  = 5;
    $cursor = $collection->find(array('nm' => $regex))
                         ->sort(array('p' => 1))
                         ->limit($limit);

    if ($cursor->count() == 0) {
        $qArr     = explode(" ", $q);
        $RegexArr = array();
        for ($i = 0; $i < sizeof($qArr); $i++) {
            $RegexArr[$i] = new MongoRegex("/^" . $qArr[$i] . "/");
        }
        $cursor = $collection->find(array('kw' => array('$all' => $RegexArr)))
                             ->limit($limit);
    }

    if ($cursor->count() == 0) {
        $regex  = new MongoRegex("/^" . strtoupper($q) . "/");
        $cursor = $collection->find(array('elc' => $regex))
                             ->sort(array('p' => 4))
                             ->limit($limit);
    }

    if ($cursor->count() == 0) { $data[] = '[]'; }
    foreach ($cursor as $obj) {
        $json = array();
        // -- some code here for iterating fields...
    }
    // unset($cursor);
    $m->close();

} catch (MongoConnectionException $e) {
    echo 'Connection Error !' . $e;
    exit();
}

Karl Seguin

Nov 7, 2011, 6:42:06 AM
to mongod...@googlegroups.com
Humm..

First, you can look at http://www.php.net/manual/en/mongopool.getsize.php; it shows how to get pool information and set the pool size. It makes sense that if you open 100 connections it does NOT actually use 100 connections. That's the whole point of the pool: to re-use already-opened connections so you don't need 100.
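A sketch of what that page covers (1.2.x driver; the pool size of 10 is just an example value):

<?php
// Sketch: inspect and tune the connection pool with the 1.2.x driver.
// The driver keeps one pool per host/port combination, so opening
// "100 connections" in a loop normally just reuses the same sockets.
MongoPool::setSize(10);          // cap each pool at 10 sockets (-1 = unlimited)

$m = new Mongo();                // checks a connection out of the pool

var_dump(MongoPool::getSize());  // configured pool size
print_r(MongoPool::info());      // per-pool stats (connections in use, in pool, ...)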

As for your other problem... I don't know enough about PHP; the only thing that's slightly suspicious is that you re-assign new cursors to the $cursor variable. I would expect PHP to track the unreferenced cursors and clean them up after the script executes. However, there's no guarantee of WHEN PHP will run that process (if it's a normal GC, it probably won't do it until there's some memory pressure, and it'll be completely unaware of MongoDB's needs). This is why languages and libraries should provide some means of deterministic finalization.
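If you want to force the issue, you can drop the reference yourself before reassigning the variable. A sketch with placeholder names (not your actual fields):

<?php
// Sketch: unset the previous cursor before reassigning the variable, so
// PHP's reference counting destroys it (and the driver can clean it up)
// immediately instead of at the end of the request.
$m          = new Mongo();
$collection = $m->selectDB('test')->selectCollection('suggestions');
$limit      = 5;

$cursor = $collection->find(array('nm' => new MongoRegex('/^foo/')))->limit($limit);

if ($cursor->count() == 0) {
    unset($cursor);   // release the first cursor right away
    $cursor = $collection->find(array('kw' => new MongoRegex('/^foo/')))->limit($limit);
}

foreach ($cursor as $doc) {
    // ... build the suggestion list ...
}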

Anyway, this seems like a bug in the PHP driver to me... though I'm not positive. It's worth opening a JIRA with your code example and your server status output. If you provide the URL to the JIRA here, I'm happy to add a comment with my thoughts on it.

Karl

Kristina Chodorow

Nov 9, 2011, 4:31:48 PM
to mongodb-user
Also, can you do some measurements to see what is taking the most
time? You could try something like xdebug
(http://devzone.zend.com/article/2899-Profiling-PHP-Applications-With-xdebug)
to figure out where the time is going, then report back your findings.



Karl Seguin

Nov 9, 2011, 8:43:44 PM
to mongod...@googlegroups.com
Kristina, 
If PHP's GC works like the other popular ones (the JVM's, the CLR's, ...), isn't it something of a cursor leak to let the GC decide when to clean them up? If the GC only runs every 20 minutes, and cursors time out after 10 minutes, won't you see a high cursor timeout count? Shouldn't there be an explicit close method available for cursors?

Karl

Kristina Chodorow

Nov 10, 2011, 9:55:38 AM
to mongodb-user
PHP uses reference counting: as soon as there are no more references
to something, it is destroyed immediately. You can kill a cursor by
letting it go out of scope.
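For example, something like this (a sketch; the collection and field names are placeholders):

<?php
// Sketch: keep the cursor local to a function so its reference count hits
// zero as soon as the function returns, which destroys the cursor.
function suggest(MongoCollection $collection, $prefix, $limit = 5) {
    $regex   = new MongoRegex('/^' . preg_quote($prefix, '/') . '/');
    $cursor  = $collection->find(array('nm' => $regex))->limit($limit);

    $results = array();
    foreach ($cursor as $doc) {
        $results[] = $doc['nm'];
    }
    return $results;   // $cursor goes out of scope here and is destroyed
}

$m = new Mongo();
print_r(suggest($m->selectDB('test')->selectCollection('suggestions'), 'foo'));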