What's the easiest way to stream logs from GAE, GKE, and GCE to ELK?


Joshua Fox

Oct 22, 2017, 4:51:49 AM
to google-a...@googlegroups.com, Google Stackdriver Discussion Forum
I'd like to use ELK, Sumo Logic, or Splunk to manage logs (metrics, search, filters, charts, dashboards, notifications) instead of the Log Viewer or Stackdriver.

What is the easiest way to stream logs to one of these log management tools?  

Kenworth (Google Cloud Platform)

Oct 22, 2017, 5:42:26 PM
to Google App Engine
These third-party tools have different features and different architectures. If you really want to go down this path, the easiest approach is to study each tool and how it integrates with GAE, GKE, and GCE. The whole point of using these tools in the first place is that they manage the streaming of your logs for you. Based on a quick search, some of them even have sample GAE projects on GitHub.

I would also recommend identifying the features missing from your current Log Viewer or Stackdriver, if any, and submitting a feature request to Google Cloud Platform as described in this article.

Evan Jones

Oct 23, 2017, 9:11:06 AM
to Google App Engine
My suggestion is to use the Stackdriver log export features to export to PubSub, Google Cloud Storage, or BigQuery, then write a small piece of code to send the entries to the destination of your choice. This has the nice advantage that it works for *all* logs from all Google Cloud products. The alternative is to find the appropriate product- and language-specific library and add it to your App Engine application. However, you will need to repeat this work for each additional language, and again if you want to run the code somewhere else.
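For example, a rough, untested sketch of creating a PubSub export sink with the Python google-cloud-logging client might look roughly like this; the project name, topic name, and filter below are placeholders you would replace with your own:

    # Rough sketch: create a Stackdriver log sink that exports matching entries
    # to a PubSub topic. "my-project", "log-export", and the filter are placeholders.
    from google.cloud import logging

    client = logging.Client(project="my-project")
    sink = client.sink(
        "elk-export-sink",
        filter_='resource.type="gae_app" OR resource.type="gce_instance" OR resource.type="container"',
        destination="pubsub.googleapis.com/projects/my-project/topics/log-export",
    )
    if not sink.exists():
        sink.create()
    # Remember to grant the sink's writer identity publish permission on the topic.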

I'm not familiar with the specific APIs for any of these products, but I suspect it should be fairly easy to write something that pulls logs from a PubSub queue and sends them to the appropriate destination, if these vendors don't provide something already.
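If you do end up writing that piece yourself, a minimal puller might look something like the sketch below (untested; the endpoint URL, project, and subscription names are placeholders): it pulls each exported log entry from a PubSub subscription and POSTs it to an HTTP collector such as a Logstash HTTP input or the Splunk HTTP Event Collector:

    # Rough sketch: pull exported log entries from PubSub and forward them to an
    # HTTP log collector. All names and the endpoint URL are placeholders.
    import json

    import requests
    from google.cloud import pubsub_v1

    COLLECTOR_URL = "http://logstash.example.com:8080"  # placeholder endpoint

    subscriber = pubsub_v1.SubscriberClient()
    subscription = subscriber.subscription_path("my-project", "log-export-sub")

    def forward(message):
        entry = json.loads(message.data.decode("utf-8"))  # the exported LogEntry, as JSON
        resp = requests.post(COLLECTOR_URL, json=entry, timeout=10)
        if resp.ok:
            message.ack()   # only ack once the collector has accepted the entry
        else:
            message.nack()  # let PubSub redeliver it later

    streaming_pull = subscriber.subscribe(subscription, callback=forward)
    streaming_pull.result()  # block and keep forwarding entries as they stream in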

Good luck!

Evan

Joshua Fox

Oct 23, 2017, 10:19:02 AM
to google-a...@googlegroups.com
Evan, thank you.

On Mon, Oct 23, 2017 at 4:11 PM, Evan Jones <evan....@bluecore.com> wrote:
My suggestion is to use the Stackdriver log export features to export to PubSub, Google Cloud Storage, or BigQuery

Would that give a bulk export or a streaming export to Splunk/Sumo/ELK? 
, then write a small piece of code to send the entries to the destination of your choice. This has the nice advantage that it works for *all* logs from all Google Cloud products. The alternative is to find the appropriate product- and language-specific library and add it to your App Engine application. However, you will need to repeat this work for each additional language, and again if you want to run the code somewhere else.

I'm not familiar with the specific APIs for any of these products, but I suspect it should be fairly easy to write something that pulls logs from a PubSub queue and sends them to the appropriate destination, if these vendors don't provide something already.


But it seems strange to have to write any code. Splunk/Sumo/ELK ought to have built-in log ingestors requiring nothing more than a configuration in GCP or a provided log4j Handler.
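To illustrate what I mean, here is a rough sketch (not something I have written or tested) of the kind of handler I would expect a vendor to supply, where the application's logging framework ships each entry straight to the vendor's collector; the endpoint below is a made-up placeholder:

    # Sketch of the sort of handler a vendor could provide: a logging handler
    # that POSTs each record to their HTTP collector. The endpoint is a placeholder.
    import json
    import logging
    import urllib.request

    class HttpCollectorHandler(logging.Handler):
        def __init__(self, url):
            super().__init__()
            self.url = url

        def emit(self, record):
            payload = json.dumps({
                "severity": record.levelname,
                "logger": record.name,
                "message": self.format(record),
            }).encode("utf-8")
            request = urllib.request.Request(
                self.url, data=payload,
                headers={"Content-Type": "application/json"},
            )
            try:
                urllib.request.urlopen(request, timeout=5)
            except Exception:
                self.handleError(record)

    logging.getLogger().addHandler(
        HttpCollectorHandler("https://collector.example.com/ingest"))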
 

Evan Jones

Oct 23, 2017, 11:03:34 AM
to Google App Engine
On Mon, Oct 23, 2017 at 10:17 AM, Joshua Fox <jos...@freightos.com> wrote:
But it seems strange to have to write any code. Splunk/Sumo/ELK ought to have built-in log ingestors requiring nothing more than a configuration in GCP or a provided log4j Handler.

I agree! I don't actually know these vendors' products very well, but I suspect they must have something. I do have experience writing things that consume the Stackdriver logging export (I absolutely love the BigQuery export, and have experimented with the PubSub log export, although we didn't end up using it "in production").

The reason I suggest this approach, even though it could be more work, is that we have found a number of the other Stackdriver logs to be useful. If you want all your logs in one place, including the HTTP load balancer logs, container engine logs, and so on, then you may end up needing to figure this out at some point anyway.
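To give a sense of what the BigQuery export buys you, a query over an exported App Engine request-log table looks roughly like the untested sketch below; the project, dataset, and table names are placeholders (the export creates one table per log per day), and I'm quoting the request-log schema from memory:

    # Rough sketch: query App Engine request logs that Stackdriver exported to
    # BigQuery. The project, dataset, and table names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    sql = """
        SELECT protoPayload.status AS status, COUNT(*) AS requests
        FROM `my-project.stackdriver_logs.appengine_googleapis_com_request_log_20171023`
        GROUP BY status
        ORDER BY requests DESC
    """
    for row in client.query(sql):  # runs the query and iterates over result rows
        print(row.status, row.requests)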

Good luck!

Evan
