Yes, $unset works, but in that case I can't update in batches, since a
find on null will also match the documents I've just run $unset on
(sketch below). I've now eliminated the null-valued fields with one
huge update anyway, just to get rid of them. I'd still like to get to
the bottom of the horrible {$type : 10} performance, though.
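To be concrete, this is roughly the batched cleanup I had in mind (the
batch size of 500 is just an example):

// Sketch: repeatedly $unset the field on small batches of documents.
db.Visits.find({Flags : null}).limit(500).forEach(function(doc) {
    db.Visits.update({_id : doc._id}, {$unset : {Flags : 1}});
});
// The problem: {Flags : null} also matches documents where the field
// is missing entirely, i.e. exactly the ones the $unset just
// processed, so the same query can't be used to pick up the next
// batch of real null values. Batching on {Flags : {$type : 10}}
// instead would avoid that, but that's precisely the query that
// performs so badly.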
After fixing the null-valued fields I ran a find with { $type : 10 }
again, just to see how long it takes.
Here's the resulting explain (and this is without any null values
present in the collection). I don't understand this at all, since a
similar query on a new collection (with 12M documents) didn't behave
this way. The index has been rebuilt after I upgraded to 2.0.
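In case it matters, "rebuilt" just means the standard shell commands,
something like:

> db.Visits.reIndex()
> db.Visits.getIndexes()  // Flags_1_CompanyId_1_AccountId_1 is listed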
I also tried a {$type : 10} query on another field in the same
collection... Lightning fast. So it's something specific to my
"Flags" array.
> db.Visits.find({Flags : {$type : 10}}).limit(1).explain()
{
    "cursor" : "BtreeCursor Flags_1_CompanyId_1_AccountId_1",
    "nscanned" : 13320222,
    "nscannedObjects" : 13320222,
    "n" : 0,
    "millis" : 16402010,
    "nYields" : 88032,
    "nChunkSkips" : 0,
    "isMultiKey" : true,
    "indexOnly" : false,
    "indexBounds" : {
        "Flags" : [
            [
                null,
                null
            ]
        ],
        "CompanyId" : [
            [
                {
                    "$minElement" : 1
                },
                {
                    "$maxElement" : 1
                }
            ]
        ],
        "AccountId" : [
            [
                {
                    "$minElement" : 1
                },
                {
                    "$maxElement" : 1
                }
            ]
        ]
    }
}
I really appreciate your help with this.

/Anders