
Collect Logs for Google Cloud Storage

This page has instructions for configuring Google Cloud Storage to send logs to Sumo.

Configure Google Cloud Storage to export logs to Stackdriver   

If you haven't already done so, set up Google Cloud Storage to export logs to Stackdriver. For more information, see Overview of Logs Export in GCP.
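Before continuing, you can confirm that Cloud Storage audit logs are actually arriving in Stackdriver. A minimal check, assuming the gcloud CLI is installed and authenticated against the project in question:

```shell
# List the five most recent log entries emitted by GCS buckets.
# If this returns entries, the export prerequisite is in place.
gcloud logging read 'resource.type="gcs_bucket"' --limit 5 --format json
```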

Set up a Google-validated HTTP source and Pub/Sub topic   

In this step, you set up an HTTP source in Sumo, register it with Google, and create a Pub/Sub topic to send data to the HTTP source. Follow the instructions in Google Cloud Platform Source.

When you create the HTTP Source, assign a Source Category of “GCP”.
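Once the HTTP Source exists, a quick way to confirm it accepts data is to POST a test message to it. The endpoint URL below is a placeholder; substitute the unique URL Sumo generated when you created the source:

```shell
# Send a test message to the Sumo HTTP Source.
# Replace the URL with your source's actual unique endpoint.
curl -X POST -d 'test message' \
  'https://endpoint.collection.sumologic.com/receiver/v1/http/UNIQUE_TOKEN'
```

A `200 OK` response indicates the source is reachable; the message should then appear in a search against the `GCP` source category.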

  1. Go to Logging and click Exports.
  2. Click Create Export.
  3. Select a GCP service to filter the logs. To collect Google Cloud Storage logs, create a sink for the "GCS Bucket" service, which sends that service's logs to Sumo Logic. In the Edit Export window on the right:

    1. Set the Sink Name. For example, "gce-vm-instance".
    2. Select "Cloud Pub/Sub" as the Sink Service.
    3. Set Sink Destination to the Pub/Sub topic you created in the Google Cloud Platform Source procedure. For example, "pub-sub-logs".
    4. Click Create Sink.
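The same export can be created from the command line. A sketch using the gcloud CLI, reusing the example sink name "gce-vm-instance" and topic "pub-sub-logs" from the steps above (substitute your own project ID):

```shell
# Create a log sink that routes GCS bucket logs to the Pub/Sub topic.
gcloud logging sinks create gce-vm-instance \
  pubsub.googleapis.com/projects/YOUR_PROJECT_ID/topics/pub-sub-logs \
  --log-filter='resource.type="gcs_bucket"'
```

Note that gcloud prints the sink's writer identity (a service account) when the sink is created; that account needs the Pub/Sub Publisher role on the topic for entries to flow.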

Query Sample

Created Resources Over Time

_sourceCategory=*gcp* data logName resource "\"type\":\"gcs_bucket\""
| parse regex "\"logName\":\"(?<log_name>[^\"]+)\"" 
| where log_name matches "projects/*/logs/cloudaudit.googleapis.com%2*"
| json "message.data.resource.labels", "message.data.protoPayload.methodName" as labels, method
| where method matches "*create" or method matches "*delete"
| json field=labels "project_id", "bucket_name", "location" as project, bucket_name, location
| timeslice 1h
| count as operations by _timeslice, method
| transpose row _timeslice column method
| fillmissing timeslice(1h)