
Collect Logs for Google Kubernetes Engine

This page has instructions for configuring Google Kubernetes Engine to send logs to Sumo.

Configure Google Kubernetes Engine to export logs to Stackdriver  

If you haven't already done so, set up Google Kubernetes Engine to export logs to Stackdriver. For more information, see Overview of Logs Export in GCP.
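If you manage clusters from the command line, the same setting can be sketched with the gcloud CLI; the cluster name and zone below are placeholders, not values from this procedure:

```shell
# Sketch: point an existing cluster at the Stackdriver (Cloud Logging)
# logging service. "my-cluster" and "us-central1-a" are placeholders.
gcloud container clusters update my-cluster \
  --zone us-central1-a \
  --logging-service logging.googleapis.com
```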

Set up a Google-validated HTTP source and Pub/Sub topic   

In this step, you set up an HTTP source in Sumo, register it with Google, and create a Pub/Sub topic to send data to the HTTP source. Follow the instructions in Google Cloud Platform Source.

When you create the HTTP Source, assign a Source Category of “GCP”.

Create export of Google Kubernetes Engine logs from Stackdriver

To send GCP logs to Sumo Logic, you must configure a Pub/Sub topic in GCP for each GCP project whose logs you want to export.

  1. In GCP, go to the Pub/Sub section from the left navigation pane.
  2. Under Pub/Sub, go to Topics. Click Create Topic.
  3. Name the topic and click Create.
  4. Click the newly created topic in the Topics pane, and select New subscription from the options menu.
  5. In the Create a subscription pane, name the subscription, set the Delivery Type to Push, enter the URL of the Sumo HTTP Source you created earlier as the endpoint, and click Create.
  6. Go to Logging and click Exports.
  7. Click Create Export.
  8. Select a GCP service to filter the logs. For Google Kubernetes Engine, create the sink for the "GKE Cluster" service, which sends that service's logs to Sumo Logic. In the Edit Export window on the right:

    1. Enter a Sink Name. For example, "gce-vm-instance".
    2. Select "Cloud Pub/Sub" as the Sink Service.
    3. Set Sink Destination to the Pub/Sub topic you created in the Google Cloud Platform Source procedure. For example, "pub-sub-logs".
    4. Click Create Sink.
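The console steps above can also be sketched with the gcloud CLI. The topic, subscription, and sink names, the project ID, and the Sumo endpoint URL below are all placeholders; use the HTTP Source URL that Sumo generated for you.

```shell
# Create the topic that will receive exported GKE logs.
gcloud pubsub topics create pub-sub-logs

# Push subscription that forwards messages to the Sumo HTTP Source
# (the endpoint URL below is a placeholder).
gcloud pubsub subscriptions create sumo-gke-sub \
  --topic pub-sub-logs \
  --push-endpoint https://endpoint.collection.sumologic.com/receiver/v1/http/XXXX

# Log sink that routes GKE cluster logs to the topic.
gcloud logging sinks create gke-cluster-sink \
  pubsub.googleapis.com/projects/my-project/topics/pub-sub-logs \
  --log-filter='resource.type="gke_cluster"'
```

Note that after creating a sink this way, you must grant the sink's writer identity (printed by the create command) the Pub/Sub Publisher role on the topic before logs will flow.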

Query Sample

Created Resources Over Time

_sourceCategory=*gcp* logName resource "\"type\":\"gke_cluster\""
| parse regex "\"logName\":\"(?<log_name>[^\"]+)\""
| where log_name matches "projects/*/logs/events"
| json "message.data.resource.labels" as labels
| json field=labels "project_id", "cluster_name" as project, cluster
| timeslice 1h
| count as events by _timeslice, cluster, project
| transpose row _timeslice column cluster, project
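To see what the query's `parse regex` stage extracts, you can exercise an equivalent pattern locally against a sample log line. The JSON below is an assumed, simplified message shape, not an actual exported record:

```shell
# Simplified sample of an exported GKE event log message (assumed shape).
line='{"logName":"projects/my-project/logs/events","resource":{"type":"gke_cluster"}}'

# Local equivalent of the query's parse regex
# "\"logName\":\"(?<log_name>[^\"]+)\"": pull out the logName value.
echo "$line" | sed -n 's/.*"logName":"\([^"]*\)".*/\1/p'
# prints: projects/my-project/logs/events
```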