This page has instructions for configuring Google Cloud SQL to send logs to Sumo.
Configure Cloud SQL to export logs to Stackdriver
If you haven't already done so, set up Google Cloud SQL to export logs to Stackdriver. For more information, see Overview of Logs Export in GCP.
Set up a Google-validated HTTP source and Pub/Sub topic
In this step, you set up an HTTP source in Sumo, register it with Google, and create a Pub/Sub topic to send data to the HTTP source. Follow the instructions in Google Cloud Platform Source.
When you create the HTTP Source, assign a Source Category of "GCP".
Create export of Cloud SQL logs from Stackdriver
- Go to Logging and click Exports.
- Click Create Export.
- Select a GCP service to filter the logs. The recommended GCP service to create a sink for is "Cloud SQL Database", which sends the service's logs to Sumo Logic.
- In the Edit Export window on the right:
  - Enter a Sink Name. For example, "cloud-sql-logs".
  - Select "Cloud Pub/Sub" as the Sink Service.
  - Set Sink Destination to the Pub/Sub topic you created in the Google Cloud Platform Source procedure. For example, "pub-sub-logs".
  - Click Create Sink.
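If you create the sink with an advanced logs filter instead of the service dropdown, a filter like the following should select the same Cloud SQL logs. This is a sketch based on the `cloudsql_database` resource type that the sample query on this page matches against, not an exhaustive filter:

```
resource.type="cloudsql_database"
```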
Created Resources Over Time

The following query counts Cloud SQL resource creation operations per hour, broken out by resource type:

_sourceCategory=*gcp* data "type":"cloudsql_database" methodName
| parse regex "\"logName\":\"(?<log_name>[^\"]+)\""
| where log_name matches "projects/*/logs/*"
| json "message.data.resource.labels", "message.data.protoPayload.methodName" as labels, method
| json field=labels "database_id", "project_id", "region" as instance, project, region
| json "message.data.protoPayload.authorizationInfo[*]" as permissions
| parse regex field=permissions "\"permission\":\"(?<resource_type>[^\"]+)\.(?<method>[^\"]+?)\"" multi
| where method = "create"
| timeslice 1h
| count as creations by _timeslice, resource_type
| transpose row _timeslice column resource_type
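As a rough illustration of what the query above extracts, here is a Python sketch using only the standard library. The sample message is a simplified, hypothetical shape of a Cloud SQL audit log entry as delivered through Pub/Sub to the HTTP source; real messages carry many more fields:

```python
import json
import re

# Hypothetical, heavily simplified Cloud SQL audit log entry as it might
# arrive via the Pub/Sub push to the Sumo HTTP source. Compact separators
# mimic the compact JSON the query's regex expects.
raw = json.dumps({
    "message": {
        "data": {
            "logName": "projects/my-project/logs/cloudaudit.googleapis.com%2Factivity",
            "resource": {
                "type": "cloudsql_database",
                "labels": {
                    "database_id": "my-project:my-instance",
                    "project_id": "my-project",
                    "region": "us-central1",
                },
            },
            "protoPayload": {
                "methodName": "cloudsql.instances.create",
                "authorizationInfo": [
                    {"permission": "cloudsql.instances.create"}
                ],
            },
        }
    }
}, separators=(",", ":"))

msg = json.loads(raw)
data = msg["message"]["data"]

# Fields the query pulls out with its json operators.
labels = data["resource"]["labels"]
instance, project, region = (
    labels["database_id"], labels["project_id"], labels["region"]
)

# Mirror the query's permission parsing:
#   "permission":"(?<resource_type>[^"]+)\.(?<method>[^"]+?)"
pattern = re.compile(r'"permission":"(?P<resource_type>[^"]+)\.(?P<method>[^"]+?)"')
permissions = json.dumps(data["protoPayload"]["authorizationInfo"],
                         separators=(",", ":"))
creations = [
    m.group("resource_type")
    for m in pattern.finditer(permissions)
    if m.group("method") == "create"
]
print(creations)  # -> ['cloudsql.instances']
```

The regex splits each permission string at its last dot, so `cloudsql.instances.create` yields resource type `cloudsql.instances` and method `create`; the query then counts those creations per one-hour timeslice.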