
Collect Logs for the Google Cloud IAM App

This page describes the Sumo pipeline for ingesting logs from Google Cloud Platform (GCP) services, and provides instructions for configuring log collection for the Google Cloud IAM App.

Collection process for GCP services

The key components in the collection process for GCP services are Google Logs Export, Google Cloud Pub/Sub, and Sumo Logic's Google Cloud Platform (GCP) Source running on a hosted collector.

The integration works like this: the GCP service generates logs, which are exported and published to a Google Pub/Sub topic via Stackdriver. You then set up a Sumo Logic Google Cloud Platform Source that subscribes to this topic and receives the exported log data.

GCP_Collection_Overview.png 

Configuring collection for GCP uses the following process: 

  1. Configure a GCP source on a hosted collector. You'll obtain the HTTP URL for the source, and then use Google Cloud Console to register the URL as a validated domain.  
  2. Create a topic in Google Pub/Sub and subscribe the GCP source URL to that topic.
  3. Create an export of GCP logs from Google Stackdriver Logging. Exporting involves writing a filter that selects the log entries you want to export, and choosing a Pub/Sub topic as the destination. The filter and destination are held in an object called a sink.

See the following sections for configuration instructions.

Configure a Google Cloud Platform Source

The Google Cloud Platform (GCP) Source receives log data from Google Pub/Sub.

In this section, you add the Source URL as an allowed domain in your GCP account. This is a Google Pub/Sub-only Source: it accepts only log data formatted as messages coming from Google Pub/Sub.

  1. In Sumo Logic, select Manage Data > Collection > Collection.
  2. Select an existing Hosted Collector upon which to add the Source. If you don't already have a Collector you'd like to use, create one, using the instructions on Configure a Hosted Collector.
  3. Click Add Source next to the Hosted Collector and click Google Cloud Platform.
  4. Enter a Name to display for the source in the Sumo web application. Description is optional.
  5. (Optional) For Source Host and Source Category, enter any string to tag the output collected from the source. Category metadata is stored in a searchable field called _sourceCategory, for example "gcp".
  6. Fields. Click the +Add Field link to add custom log metadata Fields.
    • Define the fields you want to associate; each field needs a name (key) and value.
      • A green circle with a check mark is shown when the field exists and is enabled in the Fields table schema.
      • An orange triangle with an exclamation point is shown when the field doesn't exist, or is disabled, in the Fields table schema. In this case, you are given the option to automatically add or enable the nonexistent fields in the Fields table schema. If a field sent to Sumo does not exist in the Fields schema, or is disabled, it is ignored (dropped).
  7. Log File Discovery. To set up the subscription you need to get an endpoint URL from Sumo to provide to Google. Click Copy URL and use the provided URL as an allowed domain in GCP.
    log file discovery google platform.png
    Steps to add the Source's URL as an allowed domain in GCP:
    1. Open your Google Cloud Console.
    2. Select Products and services > APIs & Services > Credentials.
    3. Select Domain Verification > Add Domain.
    4. In the Configure webhook notifications for … dialog, add the Source URL as a valid domain and click Add Domain.
    5. Click Take Me There to verify ownership of the URL; you are taken to Google's Webmaster Central interface to verify the URL.
    6. Click Add Property in the Webmaster Central site and add the Source URL.
    7. In Google, shown in the following screenshot, complete only step 1: Download and open the HTML verification file. Complete steps 8-11 of this procedure in Sumo Logic before clicking Verify; you will verify in step 12.
      verification from webmaster central.png
  8. Return to the Source configuration page in Sumo under Log File Discovery. 
    • For Verification File Name, enter the verification file name.
    • For Verification File Contents, copy and paste the full string from the verification file's body contents.
      verification file input in sumo.png
      Continue configuring the Source in Sumo.
  9. Advanced Options for Logs
    • Enable Timestamp Parsing. This option is selected by default. If it's deselected, no timestamp information is parsed at all.
    • Time Zone. There are two options for Time Zone. You can use the time zone present in your log files and choose a fallback option for messages missing time zone information, or you can force a time zone and have Sumo Logic disregard any time zone information present in logs. Whichever option you choose, it is important to set the proper time zone: if the time zone of logs can't be determined, Sumo Logic assigns them UTC, and if the rest of your logs are from another time zone your search results will be affected.
    • Timestamp Format. By default, Sumo Logic will automatically detect the timestamp format of your logs. However, you can manually specify a timestamp format for a Source. See Timestamps, Time Zones, Time Ranges, and Date Formats for more information.
  10. Processing Rules for Logs. Configure any desired filters, such as include, exclude, hash, or mask, as described in Create a Processing Rule. Processing rules are applied to log data, but not to metric data. Note that while the Sumo service will receive your data, ingestion is performed in accordance with the regular expressions you specify in processing rules.
  11. When you are finished configuring the Source click Save.
  12. Return to Google's Webmaster Central and click Verify. It should verify successfully. (If it doesn't, see the sanity-check sketch after this procedure.)
  13. Finally, return to Products and services > APIs & Services > Domain verification in your GCP console and add the Sumo Logic endpoint URL you just verified, so that the endpoint shows up as a verified domain in GCP.
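
If verification fails, the usual cause is a mismatch in the verification file name or contents. As a quick sanity check before clicking Verify, you can fetch the file yourself and confirm it is being served. The sketch below is Python with the requests library; both values are placeholders, and it assumes the file is served at the path shown on Google's verification page (check that page for the exact URL Google fetches).

# Sanity-check sketch (not part of the official procedure): confirm the
# verification file is reachable before clicking Verify in Webmaster Central.
import requests

SOURCE_DOMAIN = "https://endpoint1.collection.sumologic.com"  # domain of your Source URL (placeholder)
VERIFICATION_FILE = "google1234567890abcdef.html"             # hypothetical file name from Google

resp = requests.get(f"{SOURCE_DOMAIN}/{VERIFICATION_FILE}", timeout=10)
resp.raise_for_status()
print(resp.text)  # should exactly match the contents you pasted into Sumo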

Configure a Pub/Sub topic for GCP projects

In this step, you configure a Pub/Sub topic in GCP and add a subscription to the above source URL. Once you configure the Pub/Sub, you can export data from Google Logging to the Pub/Sub. For example, you can export Google App Engine logs, as described on Collect Logs for Google App Engine.

Limitations

Google limits the volume of data sent from a Topic. Our testing resulted in the following data limits:

Topics   Throughput               Payload size
One      18 MBps (1.5 TB/day)     100 KB
One      6 MBps (0.5 TB/day)      2.5 KB
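
The daily volumes in the table follow directly from the per-second rates, as this quick Python check shows:

# Quick arithmetic check of the table's daily-volume figures.
SECONDS_PER_DAY = 86_400

for mbps in (18, 6):
    tb_per_day = mbps * SECONDS_PER_DAY / 1_000_000  # MB/s to TB/day (decimal units)
    print(f"{mbps} MBps is roughly {tb_per_day:.2f} TB/day")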

We recommend the following:

  • Shard messages across topics, staying within the above data limits (see the sketch after this list).
  • Ask GCP to increase the allowable capacity for the topic.
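
In this pipeline Stackdriver does the publishing, so in practice you shard by creating several sinks with non-overlapping filters, each pointed at its own topic. For custom publishers, the minimal Python sketch below illustrates the hashing idea with the google-cloud-pubsub client; the project ID and topic names (pub-sub-logs-0 through pub-sub-logs-2) are hypothetical, and the topics must already exist.

# Minimal sharding sketch: route each message to one of several topics by
# hashing a stable key, so per-topic volume stays under the limits above.
import hashlib

from google.cloud import pubsub_v1

PROJECT_ID = "my-project"  # placeholder
NUM_SHARDS = 3             # hypothetical topics pub-sub-logs-0 .. pub-sub-logs-2

publisher = pubsub_v1.PublisherClient()

def publish_sharded(payload: bytes, shard_key: str) -> None:
    shard = int(hashlib.sha256(shard_key.encode()).hexdigest(), 16) % NUM_SHARDS
    topic_path = publisher.topic_path(PROJECT_ID, f"pub-sub-logs-{shard}")
    publisher.publish(topic_path, payload)
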
Steps to create a Topic
  1. In GCP, select Pub/Sub in the left navigation pane.
    gcp1.png
  2. In the Pub/Sub pane, select Topics, then click Create Topic in the Topics pane.
    gcp2.png
  3. Name the topic and click Create.
    gcp3.png

  4. Select the new topic in the Topics pane, and select New subscription from the options menu.
    gcp4.png

  5. In the Create a subscription pane:
    1. Subscription Name. Enter a name for the subscription.
    2. Delivery Type. Choose “Push into an endpoint url”, and enter the URL of the Sumo Logic Google Cloud Platform Source you created above.
    3. Click Create.
      gcp5.png
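
If you prefer to script these steps, the sketch below uses the google-cloud-pubsub Python client (pip install google-cloud-pubsub) to create the topic and the push subscription. The project ID, resource names, and Sumo endpoint URL are placeholders for your own values.

# Scripted equivalent of the console steps above: create a topic and a push
# subscription that delivers each message to the Sumo Source URL.
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"  # placeholder
SUMO_ENDPOINT = "https://endpoint1.collection.sumologic.com/receiver/v1/gcp/..."  # value from Copy URL

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

topic_path = publisher.topic_path(PROJECT_ID, "pub-sub-logs")
publisher.create_topic(request={"name": topic_path})

subscription_path = subscriber.subscription_path(PROJECT_ID, "push-to-sumo")
subscriber.create_subscription(
    request={
        "name": subscription_path,
        "topic": topic_path,
        "push_config": pubsub_v1.types.PushConfig(push_endpoint=SUMO_ENDPOINT),
    }
)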

Create export of Cloud IAM logs from Google Logging

In this step you export logs to the Pub/Sub topic you created in the previous step.

  1. Click Logging in the left-hand pane of the GCP console.

    cloud-iam-7.png
  2. Go to Exports. Click Create Export.
    cloud-iam-8.png
  3. Create a sink for each GCP service whose logs you want to send to Sumo. We recommend you create sinks for the following services: Google Project, IAM Role, and Service Account. To configure a sink:

    1. Select the service in the middle pane (Google Project, IAM Role, or Service Account).

    2. In the Edit Export window on the right:

      • Set the Sink Name. For example, "google-project".
      • Set Sink Service to “Cloud Pub/Sub”.
      • Set Sink Destination to the newly created Pub/Sub topic. For example, "pub-sub-logs".
      • Click Create Sink.
        cloud-iam-9.png
  4. By default, GCP logs are stored within Stackdriver, but you can configure Stackdriver to exclude them, as detailed here, without affecting the export to Sumo Logic outlined above. To learn how to exclude Stackdriver logs, follow the instructions in this GCP document.
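
Sinks can also be created programmatically. Below is a minimal sketch with the google-cloud-logging Python client (pip install google-cloud-logging); the project, sink name, and filter are placeholders, and the filter shown is just one plausible example for IAM role activity. Note that GCP may require you to grant the sink's writer identity publish permission on the destination topic.

# Minimal sketch: create one export sink that writes to the Pub/Sub topic.
from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project

destination = "pubsub.googleapis.com/projects/my-project/topics/pub-sub-logs"
filter_ = 'resource.type="iam_role"'  # hypothetical filter; adjust per service

sink = client.sink("google-project", filter_=filter_, destination=destination)
sink.create()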

Sample Log Message

{
 "message":{
   "data":{
     "insertId":"1b6mckoca48",
     "logName":"projects/bmlabs-loggen/logs/cloudaudit.googleapis.com%2Factivity",
     "protoPayload":{
       "@type":"type.googleapis.com/google.cloud.audit.AuditLog",
       "authenticationInfo":{
         "principalEmail":"player1@bmlabs.com"
       },
       "authorizationInfo":[{
         "granted":true,
         "permission":"iam.roles.undelete",
         "resource":"projects/bmlabs-loggen/roles/CustomRole655"
       }],
       "methodName":"google.iam.admin.v1.UndeleteRole",
       "request":{
         "@type":"type.googleapis.com/google.iam.admin.v1.UndeleteRoleRequest",
         "name":"projects/bmlabs-loggen/roles/CustomRole655"
       },
       "requestMetadata":{
         "callerIp":"73.110.42.127",
         "callerSuppliedUserAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36,gzip(gfe)"
       },
       "resourceName":"projects/bmlabs-loggen/roles/CustomRole655",
       "response":{
         "@type":"type.googleapis.com/google.iam.admin.v1.Role",
         "description":"Created on: 2017-10-24",
         "etag":"BwVcY076Hf0=",
         "group_name":"custom",
         "group_title":"Custom",
         "included_permissions":["bigquery.datasets.create"],
         "name":"projects/bmlabs-loggen/roles/CustomRole655",
         "title":"Custom Role  3"
       },
       "serviceName":"iam.googleapis.com",
       "status":{
         
       }
     },
     "receiveTimestamp":"2017-11-20T10:54:01.590EST",
     "resource":{
       "labels":{
         "project_id":"bmlabs-loggen",
         "role_name":"projects/bmlabs-loggen/roles/CustomRole655"
       },
       "type":"iam_role"
     },
     "severity":"NOTICE",
     "timestamp":"2017-11-20T10:54:01.590EST"
   },
   "attributes":{
     "logging.googleapis.com/timestamp":"2017-11-20T10:54:01.590EST"
   },
   "message_id":"164347792499667",
   "messageId":"164347792499667",
   "publish_time":"2017-11-20T10:54:01.590EST",
   "publishTime":"2017-11-20T10:54:01.590EST"
 },
 "subscription":"projects/bmlabs-loggen/subscriptions/push-to-sumo"
}
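
Because the Source receives the full Pub/Sub envelope shown above, queries address fields under message.data. The short Python sketch below (standard library only) pulls out the fields the sample highlights; it assumes the message has been saved locally as sample.json.

# Extract the interesting IAM fields from the sample message above.
import json

with open("sample.json") as f:  # sample.json: the JSON shown above
    envelope = json.load(f)

data = envelope["message"]["data"]
proto = data["protoPayload"]

print(proto["methodName"])                            # google.iam.admin.v1.UndeleteRole
print(proto["authenticationInfo"]["principalEmail"])  # player1@bmlabs.com
print(data["resource"]["type"])                       # iam_role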

Query Sample

Added roles over time

_collector="HTTP Source for GCP Pub/Sub" logName resource timestamp
| json "message.data.resource.type" as type
| parse regex "\s+\"logName\":\"(?<log_name>\S+)\""
| where type = "project" and log_name matches "projects/*/logs/cloudaudit.googleapis.com%2Factivity"
| timeslice 1h
| json "message.data.resource.labels", "message.data.resource.labels.project_id", "message.data.protoPayload.serviceData.policyDelta.bindingDeltas[*]" as labels, project, changes
| parse regex field=changes "\"role\":\"roles\\\/(?<role>[a-zA-Z.]+)\",\"member\":\".*\",\"action\":\"(?<action>[A-Z]+)\"" multi
| where action="ADD"
| count by _timeslice, role
| transpose row _timeslice column role