mongoexport query issue

Radomir Wojcik

Mar 15, 2013, 5:07:19 PM
to mongod...@googlegroups.com
Hey all,
I can't seem to get mongoexport to work with this query. The query itself works fine when I use it in find():
 
 
mongoexport --db blog --collection stats --query '{ "date" : { "$gte" : ISODate("2011-01-01T00:00:00Z"), "$lt" : ISODate("2012-02-01T00:00:00Z") }}'
 
I get the following error:
 

connected to: 127.0.0.1
Fri Mar 15 17:06:56 Assertion: 10340:Failure parsing JSON string near: "date" : {
0xb02701 0xac9da9 0xac9f2c 0x7b929d 0x7b973e 0x5724d9 0x5619ac 0xac6314 0x554e92 0x7f48813efcdd 0x554d19
 mongoexport(_ZN5mongo15printStackTraceERSo+0x21) [0xb02701]
 mongoexport(_ZN5mongo11msgassertedEiPKc+0x99) [0xac9da9]
 mongoexport() [0xac9f2c]
 mongoexport(_ZN5mongo8fromjsonEPKcPi+0x56d) [0x7b929d]
 mongoexport(_ZN5mongo8fromjsonERKSs+0xe) [0x7b973e]
 mongoexport(_ZN5mongo5QueryC1ERKSs+0x9) [0x5724d9]
 mongoexport(_ZN6Export3runEv+0x7fc) [0x5619ac]
 mongoexport(_ZN5mongo4Tool4mainEiPPc+0x13c4) [0xac6314]
 mongoexport(main+0x32) [0x554e92]
 /lib64/libc.so.6(__libc_start_main+0xfd) [0x7f48813efcdd]
 mongoexport(__gxx_personality_v0+0x421) [0x554d19]
assertion: 10340 Failure parsing JSON string near: "date" : {

Ronald Stalder

Mar 15, 2013, 5:56:40 PM
to mongod...@googlegroups.com
Hi

Are you doing this on Linux? Try escaping the dollar signs: \$gte etc.

Cheers
Ronald

Jeff Lee

Mar 15, 2013, 6:51:54 PM
to mongod...@googlegroups.com
The only way I ever found to do this was to pass in the ms since epoch:

e.g.

shard01:PRIMARY> db.foodle.find()
{ "_id" : ObjectId("5143a2503924ff36ff355335"), "id" : 1, "ts" : ISODate("2013-03-15T22:36:00.853Z") }
{ "_id" : ObjectId("5143a2553924ff36ff355336"), "id" : 2, "ts" : ISODate("2013-03-15T22:36:05.775Z") }
{ "_id" : ObjectId("5143a25b3924ff36ff355337"), "id" : 3, "ts" : ISODate("2013-03-15T22:36:11.789Z") }

$ mongoexport -version
mongoexport version 2.2.0

$ mongoexport -h localhost -d test -c foodle -q '{ts:{"$gte":new Date('`date -d "2013-03-15 22:36:05" "+%s000"`')}}'
connected to: localhost
{ "_id" : { "$oid" : "5143a2553924ff36ff355336" }, "id" : 2, "ts" : { "$date" : 1363386965775 } }
{ "_id" : { "$oid" : "5143a25b3924ff36ff355337" }, "id" : 3, "ts" : { "$date" : 1363386971789 } }
exported 2 records
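
If GNU date isn't available (the date -d trick above is GNU-specific), the same millisecond value can be worked out in Python. Just a rough sketch, assuming Python 3 and that the timestamp is meant as UTC (which matches the ISODate values above):

from datetime import datetime, timezone

# millisecond value for 2013-03-15 22:36:05 UTC, for use as new Date(<ms>) in --query
ts = datetime(2013, 3, 15, 22, 36, 5, tzinfo=timezone.utc)
print(int(ts.timestamp() * 1000))    # prints 1363386965000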

Would love to hear of an easier way if someone knows of one.





Radomir Wojcik

Mar 18, 2013, 12:10:41 PM
to mongod...@googlegroups.com
Using the escape character unfortunately yields the same results.

Ronald Stalder

Mar 18, 2013, 12:14:13 PM
to mongod...@googlegroups.com
Have you tried Jeff's method?

On 18-03-2013 13:10, Radomir Wojcik wrote:
> Using the escape character unfortunately yields the same results.

Radomir Wojcik

Mar 18, 2013, 12:15:41 PM
to mongod...@googlegroups.com
This seems to work, though I'm not sure what you mean by "ms". The only problem now is that the date is not a standard ISODate. Is there any way to convert it back during the export itself? I would like to avoid parsing it afterwards if possible.

Radomir Wojcik

Mar 18, 2013, 12:16:38 PM
to mongod...@googlegroups.com
Yeah, it worked, but the date is no longer ISO. It looks like a Unix timestamp (UTC, milliseconds since the epoch).

Radomir Wojcik

Mar 18, 2013, 12:19:55 PM
to mongod...@googlegroups.com
Using the following query, it says that there are too many positional options:

--query  '{date: {$gte: new ISODate('2013-03-15 22:36:05')}}'

Radomir Wojcik

Mar 18, 2013, 12:23:40 PM
to mongod...@googlegroups.com
Yeah, I need to get it out as ISODate. Maybe there is a better way (not mongoexport but something else), like pymongo or a database utility that can export ISODate correctly.

Ronald Stalder

Mar 18, 2013, 12:25:18 PM
to mongod...@googlegroups.com


On Monday, March 18, 2013 at 1:19:55 PM UTC-3, Radomir Wojcik wrote:
Using the following query, it says that there are too many positional options:

--query  '{date: {$gte: new ISODate('2013-03-15 22:36:05')}}'

Use single quotes on the outside and double quotes on the inner string.

Radomir Wojcik

Mar 18, 2013, 12:27:17 PM
to mongod...@googlegroups.com
Tried that too, gives me the original error again.

Ronald Stalder

Mar 18, 2013, 1:04:40 PM
to mongod...@googlegroups.com

On 18-03-2013 13:15, Radomir Wojcik wrote:
This seems to work, though I'm not sure what you mean by "ms". The only problem now is that the date is not a standard ISODate. Is there any way to convert it back during the export itself? I would like to avoid parsing it afterwards if possible.
No, but they will be converted back to ISODate when you do the mongoimport.

Radomir Wojcik

Mar 18, 2013, 2:22:09 PM
to mongod...@googlegroups.com
I just want to get the data out in a report format, though.

Jeff Lee

Mar 18, 2013, 8:04:58 PM
to mongod...@googlegroups.com
I don't think mongoexport has the ability to export datetimes as anything other than ms since the epoch (which is how the data is stored internally). If you want them represented differently, you need to do the conversion yourself, regardless of whether or not you provide any queries to filter the data.

> db.foo.find()
{ "_id" : ObjectId("5147a39e45ce3121f9f98266"), "id" : 1, "ts" : ISODate("2013-03-18T23:30:38.470Z") }

$ mongoexport -h localhost -d test -c foo

{ "_id" : { "$oid" : "5147a39e45ce3121f9f98266" }, "id" : 1, "ts" : { "$date" : 1363649438470 } }

You can do something like this to convert the datetime (I haven't checked this thoroughly):

$ mongoexport -h localhost -d test -c foo | perl -pe 'use POSIX qw(strftime); s/"\$date" : ([0-9]+) \}/"\"\$date\" :" . strftime("\"%Y-%m-%d %H:%M:%S\"", localtime($1\/1000)) . "\}"/ge'

{ "_id" : { "$oid" : "5147a39e45ce3121f9f98266" }, "id" : 1, "ts" : { "$date" :"2013-03-18 16:30:38"} }

You can also write a script to do the dump using the client language of your choice.  They should all understand the datetime type.

>>> import pymongo
>>> c=pymongo.Connection()
>>> c['test'].foo.find_one()['ts']
datetime.datetime(2013, 3, 18, 23, 30, 38, 470000)
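
And if the end goal is a report rather than a re-import, here is a rough (untested) pymongo sketch that writes a date range straight to CSV with ISO-formatted timestamps; the database, collection, and field names ("blog", "stats", "date") are just the ones from this thread:

import csv
from datetime import datetime
import pymongo

coll = pymongo.MongoClient()["blog"]["stats"]    # Connection() on older pymongo releases
query = {"date": {"$gte": datetime(2011, 1, 1), "$lt": datetime(2012, 2, 1)}}

with open("stats.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["_id", "date"])    # header row
    for doc in coll.find(query):
        # BSON dates come back as datetime objects, so isoformat() yields ISO 8601 directly
        writer.writerow([str(doc["_id"]), doc["date"].isoformat()])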

Hope that helps.




Radomir Wojcik

Mar 19, 2013, 4:55:52 PM
to mongod...@googlegroups.com
Thanks Jeff. I was able to convert it in Excel.

I ran into a new problem with mongoexport: it will not pull out fields with "(" brackets in them. So I guess I will have to write my own little script in Python after all; see my other post. I got started, but I'm not sure how to convert the JSON-like output to CSV format on the fly.

Radomir Wojcik

Mar 19, 2013, 5:15:11 PM
to mongod...@googlegroups.com
I am actually using Python.

Is there a pymongo script out there already that exports fields using find() and saves them to a CSV? That would be great.

Otherwise I will export them as JSON and find something that converts JSON to CSV.
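
Something along these lines is probably all it takes; a rough, untested sketch that assumes mongoexport's default one-document-per-line output and the field names from earlier in this thread:

import csv
import json
from datetime import datetime, timezone

# stats.json = output of something like: mongoexport -d blog -c stats -o stats.json
with open("stats.json") as src, open("stats.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["_id", "date"])
    for line in src:
        doc = json.loads(line)
        ms = doc["date"]["$date"]    # mongoexport 2.2 writes dates as ms since the epoch
        iso = datetime.fromtimestamp(ms / 1000, tz=timezone.utc).isoformat()
        writer.writerow([doc["_id"]["$oid"], iso])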