Hi Asya,
Thanks for the reply.
1) First, how would I do that? Can you provide an example of using $project to convert to local time in my earlier grouping example?
2) Even if that works, it only works at the 'hour' level. For day and month aggregations, grouping is already done by day. A range that is two days in UTC may be just one day locally (data on the 31st crosses over into the next day). If you group only by day or month, how can you convert to local time without knowing the hour, unless you always group by 'hour' and then re-aggregate? Grouping by hour seems expensive.
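To illustrate the boundary problem (and what I hope shifting the raw timestamp would solve), here is a small sketch with an assumed UTC-5 offset:

```javascript
// Two events on the same UTC day (2013-01-01), but the first one is still
// 2012-12-31 locally (UTC-5). Shifting the raw timestamp BEFORE extracting
// the day assigns each event to the correct local day without hour-level
// grouping. The offset is an assumption for illustration.
var offsetMs = -5 * 60 * 60 * 1000;

function localDay(utcMs) {
  var d = new Date(utcMs + offsetMs);
  return [d.getUTCFullYear(), d.getUTCMonth() + 1, d.getUTCDate()];
}

localDay(Date.UTC(2013, 0, 1, 3));  // -> [2012, 12, 31]
localDay(Date.UTC(2013, 0, 1, 12)); // -> [2013, 1, 1]
```

But this only works if the shift happens on the raw date before the day is extracted, which is exactly what I cannot do once the data is already grouped by UTC day.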
3) Just today, I started getting a 'doc is over the max doc size (16 MB)' error on the hourly aggregation. The daily one works fine. This seems strange: the actual result set is only 5 days' worth of hourly aggregations, i.e. 5 days times 24 hours, with a 'count' for each hour. How can that exceed 16 MB? I am in a sharded environment, so the error comes from mongos. Currently, I have about 1.6 million records for the past 5 days.
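My back-of-the-envelope estimate, assuming roughly 100 bytes per result document (the per-document size is a guess):

```javascript
// Rough upper bound on the hourly result size: 5 days of hourly buckets,
// each a small { _id: {...}, count: N } document.
// The ~100 bytes per result document is an assumption for illustration.
var buckets = 5 * 24;       // 120 result documents
var bytesPerDoc = 100;      // assumed
var totalBytes = buckets * bytesPerDoc;
totalBytes;                 // 12000 bytes -- nowhere near 16 MB
```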
4) I tried adding a separate 'ltime' field containing subfields like ltime.year, ltime.month, ltime.day, ltime.hr, ltime.min. But when I checked db.stats(), averageObjSize grew by 100%! Before adding the new field and updating all the records, the average size was 727 bytes; now db.stats() reports 1557 bytes. The physical file size has increased significantly, and free memory has dropped significantly. I think the system will soon run out of memory. Can anyone suggest what's going on?
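For comparison, here is my rough count of the raw BSON bytes the new subdocument should add, assuming int32 values (BSON: one type byte plus a NUL-terminated name per element, a 4-byte length prefix and a NUL terminator per document):

```javascript
// Approximate BSON size of the embedded subdocument
// { ltime: { year: int32, month: int32, day: int32, hr: int32, min: int32 } }
function int32FieldBytes(name) {
  return 1 + name.length + 1 + 4;   // type byte + cstring name + int32 value
}

var inner = ["year", "month", "day", "hr", "min"]
  .reduce(function (sum, n) { return sum + int32FieldBytes(n); }, 0);

var subdocBytes = 4 + inner + 1;                        // length prefix + fields + terminator
var totalAdded  = 1 + "ltime".length + 1 + subdocBytes; // outer field header + subdoc
totalAdded;   // 59 bytes
```

That is nowhere near the ~830-byte growth db.stats() reports (1557 - 727), so I presume the extra space is coming from something other than the field itself, such as document moves and padding from growing every document in place, but I would appreciate confirmation.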
Thanks.