Collect Logs for Google Cloud Audit

This page has instructions for configuring Google Cloud Audit to send logs to Sumo.

Configure Cloud Audit to export logs to Stackdriver 

If you haven't already done so, set up Google Cloud Audit to export logs to Stackdriver. For more information, see Overview of Logs Export in GCP.

Set up a Google Cloud Platform source and Pub/Sub topic

In this step, you set up a Google Cloud Platform source in Sumo, register it with Google, and create a Pub/Sub topic to send data to the source. Follow the instructions in Google Cloud Platform Source.
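If you prefer the CLI, the Pub/Sub side of this setup can be sketched with gcloud. The project, topic, and subscription names below are placeholders, and the push endpoint URL stands in for the URL Sumo displays for your Google Cloud Platform Source:

```shell
# Create the topic that the log sink will publish to ("my-project" and
# "pub-sub-logs" are example names; substitute your own).
gcloud pubsub topics create pub-sub-logs --project=my-project

# Create a push subscription that forwards messages to the Sumo source.
# The endpoint URL below is a placeholder -- use the URL shown for your
# Google Cloud Platform Source in Sumo.
gcloud pubsub subscriptions create sumo-push \
  --topic=pub-sub-logs \
  --push-endpoint="https://collectors.sumologic.com/receiver/v1/http/XXXX" \
  --project=my-project
```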

Create export of Cloud Audit logs from Stackdriver 

  1. Go to Logging in GCP.
  2. Go to Exports. Click Create Export.
  3. Add an advanced filter by clicking Convert to advanced filter under the dropdown.
  4. Create an advanced filter of logName: "logs/". For information about defining advanced filters, see Advanced Filters in GCP help.
  5. In the Edit Export window on the right:

    1. Set the Sink Name. For example, "gcp-all".
    2. Set Sink Service to "Cloud Pub/Sub".
    3. Set Sink Destination to the newly created Pub/Sub topic. For example, "pub-sub-logs".
    4. Click Create Sink.
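The same export can be created from the CLI. This sketch uses the example sink and topic names from this guide, with "my-project" as a placeholder project; note that after creating the sink you must grant its writer identity permission to publish to the topic:

```shell
# CLI equivalent of the console steps above; the filter matches the
# advanced filter used in the console.
gcloud logging sinks create gcp-all \
  pubsub.googleapis.com/projects/my-project/topics/pub-sub-logs \
  --log-filter='logName:"logs/"'

# The create command prints the sink's writer identity (a service account).
# Grant it publish rights on the topic, substituting that identity here.
gcloud pubsub topics add-iam-policy-binding pub-sub-logs \
  --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
  --role="roles/pubsub.publisher"
```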

Sample Log Message

  "message": {
    "data": {
      "insertId": "55E06F0577741.AA05843.A90CA7B9",
      "logName": "projects/bmlabs-loggen/logs/",
      "operation": {
        "id": "operation-1510758777595-55e06f047a479-fd74bd40-dc6cfc9b",
        "last": true,
        "producer": ""
      "protoPayload": {
        "@type": "",
        "authenticationInfo": {
          "principalEmail": ""
        "methodName": "beta.compute.instanceTemplates.delete",
        "requestMetadata": {
          "callerIp": "",
          "callerSuppliedUserAgent": "cloud_workflow_service"
        "resourceName": "projects/bmlabs-loggen/global/instanceTemplates/dataflow-permissionlogs-johndoe-1-11150704-7cbb-harness",
        "serviceName": ""
      "receiveTimestamp": "2018-01-26T12:08:31.316UTC",
      "resource": {
        "labels": {
          "instance_template_id": "176548811930462611",
          "instance_template_name": "dataflow-permissionlogs-johndoe-1-11150704-7cbb-harness",
          "project_id": "bmlabs-loggen"
        "type": "gce_instance_template"
      "severity": "NOTICE",
      "timestamp": "2018-01-26T12:08:31.316UTC"
    "attributes": {
      "": "2018-01-26T12:08:31.316UTC"
    "message_id": "172054682231179",
    "messageId": "172054682231179",
    "publish_time": "2018-01-26T12:08:31.316UTC",
    "publishTime": "2018-01-26T12:08:31.316UTC"
  "subscription": "projects/bmlabs-loggen/subscriptions/sumo-test"

Query Sample

Recent firewall changes

_collector="HTTP Source for GCP Pub/Sub" logName methodName principalEmail request resource timestamp
| parse regex "\"logName\":\"(?<log_name>[^\"]+)\"" 
| where log_name matches "projects/*/logs/"
| json "" as data
| json field=data "resource.type" as type
| where type = "gce_firewall_rule"
| json field=data "timestamp", "resource.labels", "resource.labels.project_id", "protoPayload.authenticationInfo.principalEmail", "protoPayload.methodName", "protoPayload.request" as timestamp, labels, project, user, method, request
| json field=request "direction", "alloweds[*]", "denieds[*]" as direction, alloweds, denieds nodrop
| if(isNull(alloweds) OR alloweds="","deny","allow") as action
| parse "\"sourceRanges\":[*]" as ranges nodrop
| parse "\"destinationRanges\":[*]" as ranges nodrop
| parse regex field=alloweds "\"IPProtocol\":\"(?<protocol>[a-zA-Z\.]+)\"[,\"a-z:]*\[?(?<ports>[0-9-\",]+)?\]?" multi nodrop
| parse regex field=denieds "\"IPProtocol\":\"(?<protocol>[a-zA-Z\.]+)\"[,\"a-z:]*\[?(?<ports>[0-9-\",]+)?\]?" multi nodrop
| count as operations by timestamp, user, method, ranges, direction, action, protocol, ports
| fields timestamp, user, method, ranges, direction, action, protocol, ports
| sort by timestamp
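To see what the IPProtocol/ports parse regex in the query extracts, you can dry-run it against a sample "alloweds" value with Python's re module. The pattern is the query's, with Sumo's (?&lt;name&gt;) groups written as Python's (?P&lt;name&gt;) and the hyphen moved to the front of the ports character class so Python does not read it as a range; the alloweds value is an illustrative example:

```shell
cat > parse_demo.py <<'EOF'
import re

# Sample firewall rule "alloweds" entry (illustrative value).
alloweds = '{"IPProtocol":"tcp","ports":["80","443"]}'

# Same pattern as the query's parse regex, adapted to Python group syntax.
pat = r'"IPProtocol":"(?P<protocol>[a-zA-Z\.]+)"[,"a-z:]*\[?(?P<ports>[-0-9",]+)?\]?'
m = re.search(pat, alloweds)
print(m.group("protocol"), m.group("ports"))
EOF
python3 parse_demo.py
```

For this input the script prints the protocol (tcp) and the raw ports list, matching the protocol and ports fields the query aggregates on.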