This page has instructions for configuring Google BigQuery to send logs to Sumo.
Configure Google BigQuery to export logs to Stackdriver
If you haven't already done so, set up Google BigQuery to export logs to Stackdriver. For more information, see Overview of Logs Export in GCP.
Set up a Google-validated HTTP source and Pub/Sub topic
In this step, you set up an HTTP source in Sumo, register it with Google, and create a Pub/Sub topic to send data to the HTTP source. Follow the instructions in Google Cloud Platform Source.
When you create the HTTP Source, assign a Source Category of “GCP”.
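If you use the gcloud CLI, the Pub/Sub portion of that procedure can be sketched as below. The topic name, subscription name, and endpoint URL are placeholders; substitute the actual URL of the Sumo HTTP Source you just created.

```shell
# Sketch, assuming the gcloud CLI is installed and authenticated.
# "pub-sub-logs" and "pub-sub-logs-sub" are placeholder names.
gcloud pubsub topics create pub-sub-logs

# Push subscription that delivers messages from the topic to the
# Sumo HTTP Source endpoint (replace the placeholder URL).
gcloud pubsub subscriptions create pub-sub-logs-sub \
  --topic=pub-sub-logs \
  --push-endpoint=https://<SUMO_HTTP_SOURCE_URL>
```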
Create export of Google BigQuery logs from Stackdriver
- Go to Logging and click Exports.
- Click Create Export.
- Select a GCP service to filter the logs. To collect BigQuery logs, create the sink for the BigQuery service; the sink routes that service's logs toward Sumo Logic. Then, in the Edit Export window on the right:
- Set the Sink Name. For example, "bigquery-logs".
- Set Sink Service to Cloud Pub/Sub.
- Set Sink Destination to the Pub/Sub topic you created in the Google Cloud Platform Source procedure. For example, "pub-sub-logs".
- Click Create Sink.
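The console steps above have a gcloud CLI equivalent; a sketch, where MY_PROJECT, MY_SINK, and pub-sub-logs are placeholder names:

```shell
# Sketch equivalent of the console steps, assuming the gcloud CLI.
# The --log-filter restricts the sink to BigQuery logs.
gcloud logging sinks create MY_SINK \
  pubsub.googleapis.com/projects/MY_PROJECT/topics/pub-sub-logs \
  --log-filter='resource.type="bigquery_resource"'
```

The command prints the sink's writer service account; grant that account the Pub/Sub Publisher role on the topic so the sink can publish to it.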
Created Resources Over Time
_sourceCategory=*gcp* logName resource "type":"bigquery_resource"
| parse regex "\"logName\":\"(?<log_name>[^\"]+)\""
| where log_name matches "projects/*/logs/cloudaudit.googleapis.com%2Factivity"
| json "message.data.resource.labels", "message.data.resource.labels.project_id" as labels, project
| timeslice 1h
| count as operations by _timeslice, project
| transpose row _timeslice column project
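The parse regex stage of this query can be checked locally. A minimal Python sketch with a hypothetical log line follows; note that Python names capture groups with `(?P<name>...)`, while the Sumo query language uses `(?<name>...)`.

```python
import re

# Hypothetical audit-log line; the field layout mirrors what the
# Stackdriver export delivers via Pub/Sub.
sample = ('{"logName":"projects/my-project/logs/'
          'cloudaudit.googleapis.com%2Factivity",'
          '"resource":{"type":"bigquery_resource"}}')

# Same pattern as the query's parse regex, in Python named-group syntax.
pattern = re.compile(r'"logName":"(?P<log_name>[^"]+)"')

log_name = pattern.search(sample).group("log_name")
print(log_name)  # projects/my-project/logs/cloudaudit.googleapis.com%2Factivity
```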