Export GKE logs to Cloud Storage using a Cloud Logging sink


Hari Chaudhary

Feb 6, 2021, 11:48:12 AM
to gce-discussion
Hi all,

I have created a sink to store GKE logs to a Cloud Storage destination, and I can see JSON files being created in Cloud Storage.

Are the GKE logs in Cloud Storage created in JSON format? Can I use those JSON files to see older data, for example logs from three months ago?

The reason for creating the sink is to store GKE logs for one year. Is this the correct approach?

Regards,
Hari  






Fady (Google Cloud Platform)

Feb 8, 2021, 4:10:34 PM
to gce-discussion

Interesting questions. I am answering them inline as follows:

  • Are the GKE logs in Cloud Storage created in JSON format? Yes, this is the intended behavior, as described in this document.

  • Can I use those JSON files to see older data, for example logs from three months ago? You can view the logs in JSON format as JSON files (called objects in GCS). Viewing the files is bound by how they are saved in the organization hierarchy, as explained here. However, the logs are only viewable from the moment you create the sink, not before: entries that Logging received before the sink was created are not stored, only newly ingested ones. This is also explained in this document, which notes: "Caution: Since exporting happens for new log entries only, you cannot export log entries that Logging received before your sink was created."

  • The reason for creating the sink is to store GKE logs for one year. Is this the correct approach? This is one way of achieving it. If you use Google Cloud Storage buckets, there is no retention period by default (objects are stored indefinitely) unless you configure one on the bucket (a retention policy), as per this document. Another way is to use a Cloud Logging bucket instead of a GCS bucket; that way you can view the ingested logs in the console and set a retention period of up to 3650 days (almost 10 years, not counting leap ones). I am attaching screenshots of the console. You may also want to compare the cost of storing logs in Cloud Logging buckets versus Google Cloud Storage buckets.
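To illustrate the two retention options above, here is a command-line sketch; the bucket names are placeholders, and the values match the one-year goal:

```shell
# Keep objects in the GCS log bucket for at least one year
# (bucket name is a placeholder).
gsutil retention set 1y gs://my-gke-logs-bucket

# Alternatively, set retention on a Cloud Logging bucket
# (retention is given in days, up to 3650).
gcloud logging buckets update my-log-bucket \
  --location=global --retention-days=365
```

Note that a GCS retention policy prevents deletion before the period ends; it does not delete objects afterwards (a lifecycle rule would do that).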

To explain the general behavior: resources such as GKE nodes and containers export logs to the Cloud Logging API. GKE does not decide where the logs are ingested; that happens in the Cloud Logging API, where the log router decides where to direct the logs and whether they are retained. More about this behavior is documented here and here. I hope the above helps.
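For reference, a sink like the one described can be created from the command line; the sink and bucket names below are placeholders, and the filter is one common way to match GKE container and node logs:

```shell
# Create a sink that routes GKE logs to a GCS bucket
# (sink and bucket names are placeholders).
gcloud logging sinks create my-gke-sink \
  storage.googleapis.com/my-gke-logs-bucket \
  --log-filter='resource.type=("k8s_container" OR "k8s_node")'

# The command prints the sink's writer service account; grant it
# permission to create objects in the destination bucket:
gsutil iam ch \
  serviceAccount:WRITER_SA_FROM_OUTPUT:roles/storage.objectCreator \
  gs://my-gke-logs-bucket
```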
Screenshot 2021-02-08 3.51.10 PM.png
Screenshot 2021-02-08 3.50.54 PM.png

Hari Chaudhary

Feb 8, 2021, 9:39:14 PM
to Fady (Google Cloud Platform), gce-discussion
Thanks Fady for explaining things in an easy way.
So now I have the logs in JSON format. How can I visualize them in a readable format? Do I need to import these JSON files into BigQuery to see the logs, or is there another way to read them?

Regards,
Hari 






Fady (Google Cloud Platform)

Feb 9, 2021, 11:48:48 PM
to gce-discussion

BigQuery is one way to visualize logs. These documents might help [1] [2]. Note, though, that this is done by streaming the logs directly to BigQuery using a sink, not by importing them from Cloud Storage. There are other solutions you might consider, such as Splunk or Datadog; this is a matter of preference. You can check this document for different scenarios.
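As a sketch of the streaming approach (project, dataset, and sink names are placeholders, and the dataset must already exist):

```shell
# Create a sink that streams GKE container logs straight to BigQuery
# (project and dataset are placeholders).
gcloud logging sinks create my-bq-sink \
  bigquery.googleapis.com/projects/my-project/datasets/gke_logs \
  --log-filter='resource.type="k8s_container"'

# Exported tables are created per log; once entries arrive they can be
# queried with standard SQL, for example:
# bq query --use_legacy_sql=false \
#   'SELECT timestamp, severity, textPayload
#    FROM `my-project.gke_logs.stdout*` LIMIT 10'
```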

From GCS, the easiest way may be to download the objects as files and open them in an application that can view structured logs. Searching the internet, I found these sites that might help [3] [4] (not an endorsement, just for clarification and example purposes).
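For a quick local look without any extra application, `jq` also works, since each exported object contains one JSON log entry per line. A minimal sketch with a hand-made sample entry (real files would first be copied down with `gsutil cp` from the log bucket):

```shell
# A sample line in the shape of an exported log entry
# (the field names match Cloud Logging's LogEntry).
cat > sample.json <<'EOF'
{"timestamp":"2021-02-08T15:50:00Z","severity":"INFO","textPayload":"container started"}
EOF

# Render each entry as a readable one-liner.
jq -r '"\(.timestamp) \(.severity) \(.textPayload)"' sample.json
# → 2021-02-08T15:50:00Z INFO container started
```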

