Save/Export GAE logs for more than 30 days in an easily readable way


Khaled Wagdy

Sep 5, 2019, 11:55:50 AM
to Google App Engine
Hi All,

Is there a best practice for keeping logs for more than 30 days while having them in an easily readable format?

Recently we had an issue and wanted to check the previous month's logs. Since logs are only kept for 30 days in Log Viewer, we couldn't find them there. We had previously set up a sink to BigQuery, and although the logs seem to be there, we couldn't find the ones we were looking for. I'm not sure whether this is a lack of experience with BigQuery on our part, or whether only the request logs from actual users are kept. We are looking for the logs of a particular monthly cron job, which for some reason we couldn't find in the backed-up logs on BigQuery.

We thought we would set up another sink to Cloud Storage for the future, but the logs seem to be saved in JSON format, which includes so much information that it is hard to read in critical situations where we just want to see the normal log flow as it appears in Log Viewer.
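For reference, a small post-processing script can turn the verbose JSON entries from a Cloud Storage sink back into something close to a Log Viewer line. This is only a sketch: the sample entry below is a pared-down guess at the exported shape (the field names follow the App Engine request-log entry layout, but real exported entries carry many more fields), so adjust the keys to match your actual data.

```python
import json

# Pared-down example of one exported log entry (assumed shape).
entry_json = """
{
  "timestamp": "2019-08-06T17:29:19.564261Z",
  "severity": "INFO",
  "httpRequest": {"status": 200},
  "protoPayload": {
    "method": "GET",
    "resource": "/cron/monthly-report",
    "line": [{"severity": "INFO", "logMessage": "report generated"}]
  }
}
"""

def summarize(entry: dict) -> str:
    """Collapse a JSON log entry into one Log Viewer-style line."""
    payload = entry.get("protoPayload", {})
    app_lines = " | ".join(l["logMessage"] for l in payload.get("line", []))
    return (f'{entry["timestamp"]} {entry["severity"]} '
            f'{payload.get("method", "")} {payload.get("resource", "")} '
            f'{entry.get("httpRequest", {}).get("status", "")} {app_lines}')

print(summarize(json.loads(entry_json)))
# -> 2019-08-06T17:29:19.564261Z INFO GET /cron/monthly-report 200 report generated
```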

We used to follow the instructions on the Download logs page, using the command appcfg.py request_logs to download logs periodically, but we stopped doing that a while ago. The problem with this method is that instead of showing the date and time as they appear in Log Viewer, timestamps appear in what I assume is epoch format, which is still hard to read. Example:
1:1565112559.564261 log line...
3:1565112559.564692 log line...
...
Furthermore, this method seems to be on its way to deprecation, as mentioned on the Download logs page, and the new gcloud tool does not have a download option:

Warning: The following describes using the appcfg tool to download logs. This tool is now deprecated. Currently the replacement, Cloud SDK, does not support this download capability; however, you can view your logs using the gcloud app log commands.
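For anyone stuck with old appcfg downloads, the epoch prefixes are at least easy to convert to UTC after the fact. A minimal sketch, with the "level:epoch" line format assumed from the sample above:

```python
import re
from datetime import datetime, timezone

# Assumed appcfg.py request_logs line shape: "<level>:<epoch seconds> <message>"
LINE_RE = re.compile(r"^(\d+):(\d+\.\d+)\s(.*)$")

def readable(line: str) -> str:
    """Rewrite the epoch prefix as a readable UTC timestamp."""
    m = LINE_RE.match(line)
    if not m:
        return line  # pass through lines without a timestamp prefix
    level, epoch, message = m.groups()
    ts = datetime.fromtimestamp(float(epoch), tz=timezone.utc)
    return f"{ts.strftime('%Y-%m-%d %H:%M:%S')} UTC [{level}] {message}"

print(readable("1:1565112559.564261 log line..."))
# -> 2019-08-06 17:29:19 UTC [1] log line...
```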

I am not sure why it is not easy to export the Log Viewer output as it is; it's really very convenient in that form, and we just need a way to download it straightforwardly.

Questions:
  1. Knowing that we have a sink set up to BigQuery, is there a way to fetch last month's cron job logs? Were they exported as part of the sink? How can we filter in BigQuery to fetch those logs?
  2. Is there a way to export logs so they are kept for more than 30 days, without changing the Log Viewer format?

Anthony (Google Cloud Support)

Sep 9, 2019, 3:51:08 PM
to Google App Engine
Hi Khaled,

To answer your questions:

1. Your sink will retain all logs from the day it was created. That means that 31 days from now it will still contain day 1's logs as well as day 31's, rather than dropping day 1. It also means that any logs written before the export sink was created are not retained and will be removed after 30 days.

2. Regarding your second inquiry, an export sink is the only way to retain logs for more than 30 days. That said, as per the following document [1], these are the formats currently provided by Stackdriver.

I hope this helps.

Khaled Wagdy

Sep 9, 2019, 4:07:58 PM
to Google App Engine
Hello Anthony,

Thank you for your feedback. I am not quite sure I understood your first point: when you say the sink will hold the 31st day, do you mean it will not log the 31st day? In our case I think we have set up the sink to export the logs daily, since in BigQuery we can choose by day in the drop-down menu, so I think it's good for now. The query we need to write in BigQuery is a bit cumbersome, though, since there are a lot of fields and it's a bit confusing.

For the second point, all we ask for is an easily readable format, just like Log Viewer: date, event, and the collected information for each request, that's it. The closest format we have seen is the appcfg log download, which, as mentioned before, has epoch timestamps that are challenging to read. We would settle for the log-download ability and could set up a cron job to download logs to a Cloud Storage bucket for future reference. But, also as mentioned, for some reason this is on its way to deprecation. Would you be able to relay a feature request to the gcloud SDK team for gcloud to be able to download logs like appcfg, this time preferably with UTC timestamps rather than epoch times? That would be very much appreciated.

Thank you,

Vivak Patel

Sep 11, 2019, 2:44:32 PM
to Google App Engine
Hi Khaled,

I believe what Anthony meant to say was that exporting logs using sinks will mitigate the loss of logs from Log Viewer past the 30-day mark. However, logs from before the creation of the sink will still be lost.

As for the feature request, we always appreciate users collaborating with our engineers on ideas to improve the product. This discussion group does not handle feature requests, which is why I encourage you to file the feature request on the Public Issue Tracker [1], as it will give you visibility into its status.

[1] - https://issuetracker.google.com/

Khaled Wagdy

Sep 12, 2019, 1:05:14 PM
to Google App Engine
Thank you Vivak,

I have followed your advice and posted on the Issue Tracker. They pointed me in the right direction, so I thought I would share it here in case someone else would like to download their logs.

The command to use is: $ gcloud logging read. I have set the severity, timestamp, and format, and chosen the fields I believe are useful for tracing. I also redirected the output to a file. Adjust the parameters according to your requirements.

$ gcloud logging read "severity>=DEBUG AND timestamp>=\"2019-09-12T00:00:00Z\"" --format='table[no-heading](timestamp, protoPayload.ip, protoPayload.method, protoPayload.resource, httpRequest.status, protoPayload.userAgent ,protoPayload.line[]:format="value[separator=\"    \"](severity, logMessage)")' >> 20190912.txt
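For completeness, here is a sketch of wrapping that command in a small script so a daily cron job could archive each day's logs to a dated file. The helper names are hypothetical, and it assumes the Cloud SDK is installed and authenticated; the filter mirrors the one in the command above.

```python
import subprocess
from datetime import date, timedelta

def day_filter(day: date) -> str:
    """Build a log filter covering exactly one UTC day."""
    nxt = day + timedelta(days=1)
    return (f'severity>=DEBUG AND timestamp>="{day.isoformat()}T00:00:00Z"'
            f' AND timestamp<"{nxt.isoformat()}T00:00:00Z"')

def download(day: date) -> None:
    """Shell out to gcloud and write the day's logs to e.g. 20190912.txt."""
    out = f"{day.strftime('%Y%m%d')}.txt"
    with open(out, "w") as fh:
        subprocess.run(["gcloud", "logging", "read", day_filter(day)],
                       stdout=fh, check=True)

print(day_filter(date(2019, 9, 12)))
# -> severity>=DEBUG AND timestamp>="2019-09-12T00:00:00Z" AND timestamp<"2019-09-13T00:00:00Z"
```

The resulting files could then be copied to a Cloud Storage bucket (for example with gsutil cp) for long-term retention, as discussed earlier in the thread.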
