Hi CCF Team,
In our current setup, we are running CCF on an EC2 instance through Docker Compose, with MongoDB connected for persistent data storage/caching.
However, the CUR data set we are trying to load is quite large, and we are having trouble getting CCF to read the data from Athena and cache it in MongoDB, as we keep hitting these 504 Gateway Timeout errors:
We have to manually reload the page for CCF to retry reading the data through Athena, and after several attempts the Docker API container crashes / times out:
This results in a 502 Bad Gateway error on the front page:
We have tried upgrading the EC2 instance to a larger / faster instance type, but we are still facing similar issues. Is there a way for us to overcome this gateway timeout issue so our data can load successfully?
As a workaround, we are truncating the data and feeding it into CCF month by month, since we have CUR report data going all the way back to 2021. After multiple tries it eventually loads, but we are wondering if there is a better way to do this?
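For context, the month-by-month workaround we are doing through the browser could roughly be scripted like this instead, requesting one month at a time from the CCF API directly so each request stays small enough to finish before the gateway timeout. (The endpoint, port, and query parameters here are our assumptions about how the CCF client calls the API; please correct us if they are wrong.)

```shell
#!/usr/bin/env bash
# Sketch (assumptions noted above): pre-warm the MongoDB cache one month
# at a time by querying the CCF API directly, instead of reloading the
# whole date range in the browser and hitting 504s.
for year in 2021 2022; do
  for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
    start="${year}-${m}-01"
    # Last day of the month, computed with GNU date.
    end=$(date -d "${start} +1 month -1 day" +%F)
    # Assumed endpoint/port; adjust to match your deployment.
    curl -sf "http://localhost:4000/api/footprint?start=${start}&end=${end}" \
      > /dev/null || echo "request failed for ${year}-${m}"
  done
done
```

Each successful request should populate the cache for that month, so the front end only has to read from MongoDB afterwards.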
Also, is it possible to increase the pagination limit on MongoDB through Docker Compose? I understand that it is currently limited to 50,000 pages per API call, but since we have almost a million, it is taking a long time for CCF to load and fetch the data from MongoDB. I have tried adding this line in the Docker Compose file:
But so far it still sticks to the default 50,000 pagination limit. Is there another way to configure this pagination limit through the Docker Compose file?
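For reference, this is roughly the shape of what we tried, passing the override as an environment variable on the API service. The variable name below is only a placeholder, since we could not confirm which setting (if any) CCF actually reads for this limit:

```yaml
# docker-compose.yml (excerpt) -- CCF API service.
# NOTE: MONGODB_PAGE_LIMIT is a hypothetical placeholder name, not a
# confirmed CCF setting; we are asking which variable (if any) applies.
services:
  api:
    environment:
      - MONGODB_PAGE_LIMIT=1000000
```

If the limit is not configurable via environment variables, a pointer to where it is set in the code would also help us.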
Thank you for your time and hope to hear from the team soon!
Best Regards,
Mukmin