
Collect Logs for Google Compute Engine

Instructions for configuring log collection for the Sumo Logic App for Google Compute Engine.

This page describes the Sumo pipeline for ingesting logs from Google Cloud Platform (GCP) services, and provides instructions for collecting logs from Google Compute Engine.

Collection process for GCP services

The key components in the collection process for GCP services are Google Logs Export, Google Cloud Pub/Sub, and Sumo’s Google Cloud Platform (GCP) source running on a hosted collector. 

The GCP service generates logs which are exported and published to a Google Pub/Sub topic through Stackdriver. You will then set up a Sumo Logic Google Cloud Platform source that subscribes to this topic and receives the exported log data.


Configuring collection for GCP uses the following process: 

  1. Configure a GCP source on a hosted collector. You'll obtain the HTTP URL for the source, and then use Google Cloud Console to register the URL as a validated domain.  
  2. Create a topic in Google Pub/Sub and subscribe the GCP source URL to that topic.
  3. Create an export of GCP logs from Google Stackdriver Logging. Exporting involves writing a filter that selects the log entries you want to export, and choosing a Pub/Sub topic as the destination. The filter and destination are held in an object called a sink. 

See the following sections for configuration instructions.

Configure a Google Cloud Platform Source

The Google Cloud Platform (GCP) Source receives log data from Google Pub/Sub.

This Source will be a Google Pub/Sub-only Source, which means it can only ingest log data formatted as messages coming from Google Pub/Sub.

  1. In Sumo Logic, select Manage Data > Collection > Collection.
  2. Select an existing Hosted Collector upon which to add the Source. If you don't already have a Collector you'd like to use, create one, using the instructions on Configure a Hosted Collector.
  3. Click Add Source next to the Hosted Collector and click Google Cloud Platform.
  4. Enter a Name to display for the Source. A Description is optional.
  5. Source Host (Optional). The Source Host value is tagged to each log and stored in a searchable metadata field called _sourceHost. Avoid using spaces so you do not have to quote the value in keyword search expressions. This can be a maximum of 128 characters.
  6. Source Category (Optional). The Source Category value is tagged to each log and stored in a searchable metadata field called _sourceCategory. See our Best Practices: Good Source Category, Bad Source Category. Avoid using spaces so you do not have to quote the value in keyword search expressions. This can be a maximum of 1,024 characters.
  7. Fields. Click the +Add Field link to add custom log metadata Fields, then define the fields you want to associate. Each field needs a name (key) and value. Look for one of the following icons and act accordingly:
    • If an orange triangle with an exclamation point is shown, use the option to automatically add or enable the nonexistent fields before proceeding to the next step. The orange icon indicates that the field doesn't exist, or is disabled, in the Fields table schema. If a field sent to Sumo Logic does not exist in the Fields schema, or is disabled, it is ignored (dropped).
    • If a green circle with a checkmark is shown, the field exists and is already enabled in the Fields table schema. Proceed to the next step.
  8. Advanced Options for Logs.


    • Timestamp Parsing. This option is selected by default. If it's deselected, no timestamp information is parsed at all.
    • Time Zone. There are two options for Time Zone. You can use the time zone present in your log files and choose a fallback option for messages that lack time zone information, or you can force a time zone and have Sumo Logic disregard any time zone information present in the logs. Whichever option you choose, it is important to set the time zone correctly: if the time zone of a log can't be determined, Sumo Logic assigns it UTC, and if the rest of your logs are from another time zone your search results will be affected.
    • Timestamp Format. By default, Sumo Logic will automatically detect the timestamp format of your logs. However, you can manually specify a timestamp format for a Source. See Timestamps, Time Zones, Time Ranges, and Date Formats for more information.
  9. Processing Rules. Configure any desired filters, such as allowlist, denylist, hash, or mask, as described in Create a Processing Rule.
  10. When you are finished configuring the Source, click Save.

Configure a Pub/Sub Topic for GCP

You need to configure a Pub/Sub Topic in GCP and add a subscription to the Source URL that belongs to the Sumo Logic Google Cloud Platform Source you created. Once the Pub/Sub topic is configured, you can export data from Google Logging to it. For example, you can export Google App Engine logs, as described in Collect Logs for Google App Engine.

  1. Create a Pub/Sub Topic in GCP. See the Google Cloud documentation for the latest configuration steps.
  2. Create a Pub/Sub subscription to the Source URL that belongs to the Sumo Logic Google Cloud Platform Source you created. See the Google Cloud documentation for the latest configuration steps.
    • Use the Push delivery method to the Sumo Logic Source URL. To determine the URL, navigate to the Source on the Collection page in Sumo Logic and click Show URL. (A scripted alternative to these console steps is sketched below.)
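
If you prefer to script this setup, the following minimal sketch uses the google-cloud-pubsub Python client to create the topic and the push subscription. The project ID, topic and subscription names, and the Source URL are placeholders, not values from this guide; substitute your own.

from google.cloud import pubsub_v1

PROJECT_ID = "my-project"            # placeholder: your GCP project
TOPIC_ID = "pub-sub-logs"            # placeholder: topic name
SUBSCRIPTION_ID = "sumo-gcp-source"  # placeholder: subscription name
SUMO_URL = "https://..."             # paste the URL shown by Show URL

# Create the topic that the log export will publish into.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)
publisher.create_topic(request={"name": topic_path})

# Create a push subscription so Pub/Sub POSTs each message
# to the Sumo Logic Source URL.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)
push_config = pubsub_v1.types.PushConfig(push_endpoint=SUMO_URL)
subscriber.create_subscription(
    request={
        "name": subscription_path,
        "topic": topic_path,
        "push_config": push_config,
    }
)
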
Limitations

Google limits the volume of data sent from a Topic. Our testing resulted in the following data limits:

Topics    Megabytes per second      Payload size
One       18 MBps (1.5 TB/day)      100 KB
One       6 MBps (0.5 TB/day)       2.5 KB

We recommend the following:

  • Shard messages across topics within the above data limits (one approach is sketched after this list).
  • Ask GCP to increase the allowable capacity for the topic.
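
Because the publisher in this pipeline is Stackdriver's log export rather than your own code, one way to shard is to split the export across several sinks, each with a narrower filter and its own topic. The sketch below, using the google-cloud-logging Python client, partitions GCE logs by zone; the project ID, zone list, and naming scheme are assumptions for illustration, and the per-zone topics must already exist.

from google.cloud import logging

PROJECT_ID = "my-project"  # placeholder: your GCP project
ZONES = ["us-central1-a", "us-central1-b", "us-central1-f"]  # hypothetical partitions

client = logging.Client(project=PROJECT_ID)
for zone in ZONES:
    # Each sink exports only the GCE logs for its zone to its own topic.
    sink = client.sink(
        f"gce-vm-instance-{zone}",
        filter_=f'resource.type="gce_instance" AND resource.labels.zone="{zone}"',
        destination=f"pubsub.googleapis.com/projects/{PROJECT_ID}/topics/gce-logs-{zone}",
    )
    sink.create()  # the sink's writer identity still needs publish access to the topic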

Create an export of Google Compute Engine logs from Google Logging

In this step, you export logs to the Pub/Sub topic you created in the previous step. A scripted equivalent is sketched after the steps below.

  1. Go to Logging and click Logs Router.
  2. Click Create Sink.
  3. Click the arrow next to Filter by label or text and select Convert to advanced filter.
  4. For resource_type, replace "<resource_variable>" with "gce_instance".
  5. Select a GCP service to filter the logs. The recommended GCP service to create sinks for is "GCE VM Instance", which sends the service’s logs to Sumo Logic. In the Edit Export window on the right:
    1. Set the Sink Name. For example, "gce-vm-instance".
    2. Select "Cloud Pub/Sub" as the Sink Service.
    3. Set Sink Destination to the newly created Pub/Sub topic. For example, "pub-sub-logs".
    4. Click Create Sink.
  6. By default, GCP logs are stored within Stackdriver, but you can configure Stackdriver to exclude them without affecting the export to Sumo Logic described above. For instructions on excluding Stackdriver logs, see the GCP documentation.
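
If you would rather create the sink programmatically, the following minimal sketch uses the google-cloud-logging Python client. The sink name, topic, and project ID reuse the example values from the steps above; adjust them for your environment.

from google.cloud import logging

PROJECT_ID = "my-project"  # placeholder: your GCP project

client = logging.Client(project=PROJECT_ID)
sink = client.sink(
    "gce-vm-instance",  # Sink Name
    filter_='resource.type="gce_instance"',  # select GCE VM Instance log entries
    destination=f"pubsub.googleapis.com/projects/{PROJECT_ID}/topics/pub-sub-logs",
)
sink.create()  # assumes the topic exists and the sink's writer identity can publish to it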

Sample Log Message

{
 "message":{
   "data":{
     "insertId":"55E9891F381C2.A6AC1EA.F3043722",
     "logName":"projects/wk-dev/logs/cloudaudit.googleapis.com%2Factivity",
     "operation":{
       "first":true,
       "id":"operation-1511384259910-55e9891ee5970-33fdc63d-4bee6b10",
       "producer":"compute.googleapis.com"
     },
     "protoPayload":{
       "@type":"type.googleapis.com/google.cloud.audit.AuditLog",
       "authenticationInfo":{
         "principalEmail":"service-287993422434@dataflow-service-producer-prod.iam.gserviceaccount.com"
       },
       "authorizationInfo":[{
         "granted":true,
         "permission":"compute.instances.delete"
       }],
       "methodName":"v1.compute.instances.delete",
       "requestMetadata":{
         "callerSuppliedUserAgent":"Managed Infrastructure Mixer Client"
       },
       "resourceName":"projects/287993422434/zones/us-central1-f/instances/permissionlogs-yuanwang-1-11221246-d0b6-harness-p548",
       "response":{
         "@type":"compute.googleapis.com/operation",
         "id":"6917821783428586027",
         "insertTime":"2017-11-22T12:57:40.084-08:00",
         "name":"operation-1511384259910-55e9891ee5970-33fdc63d-4bee6b10",
         "operationType":"delete",
         "progress":"0",
         "selfLink":"https://www.googleapis.com/compute/v1/projects/wk-dev/zones/us-central1-f/operations/operation-1511384259910-55e9891ee5970-33fdc63d-4bee6b10",
         "status":"PENDING",
         "targetId":"7642006033207418043",
         "targetLink":"https://www.googleapis.com/compute/v1/projects/wk-dev/zones/us-central1-f/instances/permissionlogs-yuanwang-1-11221246-d0b6-harness-p548",
         "zone":"https://www.googleapis.com/compute/v1/projects/wk-dev/zones/us-central1-f"
       },
       "serviceName":"compute.googleapis.com"
     },
     "receiveTimestamp":"2017-11-22T20:57:41.0202444Z",
     "resource":{
       "labels":{
         "instance_id":"7642006033207418043",
         "project_id":"wk-dev",
         "zone":"us-central1-f"
       },
       "type":"gce_instance"
     },
     "severity":"NOTICE",
     "timestamp":"2017-11-22T20:57:39.896Z"
   },
   "attributes":{
     "logging.googleapis.com/timestamp":"2017-11-22T20:57:39.896Z"
   },
   "message_id":"174545382671298",
   "messageId":"174545382671298",
   "publish_time":"2017-11-22T20:57:42.118Z",
   "publishTime":"2017-11-22T20:57:42.118Z"
 },
 "subscription":"projects/wk-dev/subscriptions/sumo-test"
}

Query Sample

Top 10 users

_collector="HTTP Source for GCP Pub/Sub" logName resource timestamp
| json "message.data.resource.type" as type 
| parse regex "\s+\"logName\":\"(?<log_name>\S+)\"" 
| where type = "gce_instance" and log_name matches "projects/*/logs/cloudaudit.googleapis.com%2Factivity"
| parse regex "\s+\"resourceName\":\"projects/\S+/zones/(?<zone>\S+)/instances/(?<instance>\S+)\""
| json "message.data.resource.labels" as labels
| json field=labels "project_id" as project
| json "message.data.protoPayload.authenticationInfo.principalEmail" as user
| count as requests by user
| sort by requests
| limit 10
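
As a variation on the same data, this hypothetical query counts activity by API method instead of by user. It assumes the same collector name and message structure as the sample log message above.

Top 10 methods

_collector="HTTP Source for GCP Pub/Sub" logName resource timestamp
| json "message.data.resource.type" as type
| where type = "gce_instance"
| json "message.data.protoPayload.methodName" as method
| count as requests by method
| sort by requests
| limit 10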