
Collect Logs for Google Cloud VPC

Instructions for collecting logs from Google Cloud VPC.

This page has instructions for configuring Google Cloud VPC to send logs to Sumo.

Configure Cloud VPC to export logs to Stackdriver 

If you haven't already done so, set up Google Cloud VPC to export logs to Stackdriver. For more information, see Overview of Logs Export in GCP.

Set up a Google Cloud Platform source and Pub/Sub topic

In this step, you set up a Google Cloud Platform source in Sumo, register it with Google, and create a Pub/Sub topic to send data to the source. Follow the instructions in Google Cloud Platform Source.
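If you prefer the CLI to the console, the topic and push subscription can also be created with gcloud. This is a sketch: the topic and subscription names mirror the example values used on this page, and the push endpoint URL is a placeholder — use the URL that Sumo generates for your Google Cloud Platform Source.

```shell
# Create the topic that Stackdriver will publish VPC flow logs to.
gcloud pubsub topics create pub-sub-logs

# Create a push subscription that forwards messages to the Sumo source.
# Replace the endpoint URL with the one shown for your source in Sumo.
gcloud pubsub subscriptions create push-to-sumo \
  --topic=pub-sub-logs \
  --push-endpoint=https://endpoint.collection.sumologic.com/receiver/v1/http/YOUR_TOKEN
```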

Create export of Google Cloud VPC logs from Stackdriver

  1. Go to Logging in GCP.
  2. Go to Exports. Click Create Export.
  3. Select "GCE Subnetwork" as the GCP service to filter the logs.

  4. In the Edit Export window on the right:

    1. Set the Sink Name. For example, "gcp-subnetwork."
    2. Set Sink Service to "Cloud Pub/Sub".
    3. Set Sink Destination to the newly created Pub/Sub topic. For example, pub-sub-logs.
    4. Click Create Sink.
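The console steps above can also be performed from the command line. This is a sketch using the example names from this page ("gcp-subnetwork" sink, "pub-sub-logs" topic); YOUR_PROJECT and the sink's service account are placeholders you must fill in.

```shell
# Create a sink that exports GCE subnetwork (VPC flow) logs to the topic.
gcloud logging sinks create gcp-subnetwork \
  pubsub.googleapis.com/projects/YOUR_PROJECT/topics/pub-sub-logs \
  --log-filter='resource.type="gce_subnetwork"'

# Grant the sink's writer identity permission to publish to the topic.
# (The create command above prints the service account to authorize.)
gcloud pubsub topics add-iam-policy-binding pub-sub-logs \
  --member='serviceAccount:SINK_SERVICE_ACCOUNT' \
  --role='roles/pubsub.publisher'
```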

Sample log message 

  "message": {
    "data": {
      "insertId": "h7cue3dc1fr",
      "jsonPayload": {
        "bytes_sent": "1836",
        "connection": {
          "dest_ip": "",
          "dest_port": 443,
          "protocol": 6,
          "src_ip": "",
          "src_port": 56552
        "dest_location": {
          "city": "Ashburn",
          "continent": "America",
          "country": "usa",
          "region": "Virginia"
        "end_time": "2018-01-26T12:35:10.115UTC",
        "packets_sent": "20",
        "reporter": "SRC",
        "rtt_msec": "49",
        "src_instance": {
          "project_id": "bmlabs-loggen",
          "region": "us-central1",
          "vm_name": "vm-selectstar-collector-again",
          "zone": "us-central1-c"
        "src_vpc": {
          "project_id": "bmlabs-loggen",
          "subnetwork_name": "default",
          "vpc_name": "default"
        "start_time": "2018-01-26T12:35:10.115UTC"
      "logName": "projects/bmlabs-loggen/logs/",
      "receiveTimestamp": "2018-01-26T12:35:10.115UTC",
      "resource": {
        "labels": {
          "location": "us-central1-c",
          "project_id": "bmlabs-loggen",
          "subnetwork_id": "3656133720937113003",
          "subnetwork_name": "default"
        "type": "gce_subnetwork"
      "timestamp": "2018-01-26T12:35:10.115UTC"
    "attributes": {
      "": "2018-01-26T12:35:10.115UTC"
    "message_id": "172581793992900",
    "messageId": "172581793992900",
    "publish_time": "2018-01-26T12:35:10.115UTC",
    "publishTime": "2018-01-26T12:35:10.115UTC"
  "subscription": "projects/bmlabs-loggen/subscriptions/push-to-sumo"

Example query

Average latency (ms) by subnet ID

_collector="HTTP Source for GCP Pub/Sub" logName resource timestamp
| json "" as type 
| parse regex "\"logName\":\"(?<log_name>[^\"]+)\"" 
| where type = "gce_subnetwork" | where log_name matches "projects/*/logs/"
| json "" as resource | json field=resource "labels.location","labels.project_id","labels.subnetwork_id","labels.subnetwork_name" as zone,project,subnetwork_id,subnetwork_name nodrop
| json "", "" as labels, payload
| json field=payload "src_instance","dest_instance" as src_instance,dest_instance nodrop 
| json field=payload "src_vpc.vpc_name","dest_vpc.vpc_name" as src_vpc,dest_vpc nodrop
| json field=payload "connection.src_ip","connection.dest_ip","connection.dest_port","connection.src_port" as src_ip,dest_ip,dest_port,src_port 
| json field=src_instance "project_id", "zone", "region", "vm_name" as src_project, src_zone, src_region, src_vm nodrop 
| json field=dest_instance "project_id", "zone", "region", "vm_name" as dest_project, dest_zone, dest_region, dest_vm nodrop
| json field=payload "bytes_sent","rtt_msec","packets_sent"  as bytes, rtt,packets  
| timeslice 1m
| avg(rtt) as latency by _timeslice, subnetwork_id, subnetwork_name 
| transpose row _timeslice column subnetwork_id,subnetwork_name