
Gmail Trace Logs Source

note

This source was originally developed for Gmail logs in BigQuery, which has been replaced by Google Workspace logs and reports in BigQuery.

You'll need to use our Google BigQuery source instead (see Example 3: Query Gmail Logs).

The Gmail Trace Logs integration pulls Gmail logs from BigQuery using the BigQuery library APIs and ingests them into Sumo Logic for storage, analysis, and alerting.

note

This source is available in all deployments, including the Fed deployment.

Data collected

Polling Interval | Data
5 min | Set up Gmail logs in BigQuery
5 min | Schema for Gmail logs in BigQuery

Setup

Vendor configuration

Follow the steps below to obtain the Service Account's credential JSON file and complete the authorization process.

  1. In the Google Cloud Console, open IAM & Admin.
  2. Select the Service Accounts tab.
  3. From the project dropdown, select the project where you will run the BigQuery jobs.
  4. Click Create a Service Account and follow the instructions in the Create service accounts Google Cloud documentation.
  5. Click the email address of the service account you created, then click the KEYS tab.
  6. Click ADD KEY and choose Create new key.
  7. Select JSON as the key type.
  8. Click Create. A JSON key file is downloaded to your computer.
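The downloaded key file is a standard service-account JSON document. As a sketch (the field names below are the standard service-account keys, and the values are placeholders, not real credentials), the values the source configuration asks for can be pulled out of it like this:

```python
import json

# Placeholder service-account key file contents (not real credentials).
key_json = """
{
  "type": "service_account",
  "project_id": "my-project-123",
  "private_key": "-----BEGIN PRIVATE KEY-----\\nMIIB...\\n-----END PRIVATE KEY-----\\n",
  "client_email": "my-sa@my-project-123.iam.gserviceaccount.com",
  "token_uri": "https://oauth2.googleapis.com/token"
}
"""

key = json.loads(key_json)

# These map onto the source configuration fields described below.
project_id = key["project_id"]      # Project ID
private_key = key["private_key"]    # Private Key
client_email = key["client_email"]  # Client Email
token_uri = key["token_uri"]        # Token URI
print(project_id, client_email, token_uri)
```

Keep the key file secure; the private key in it is the only credential the source needs for authentication.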

Source configuration

When you create a Gmail Trace Logs Source, you add it to a Hosted Collector. Before creating the Source, identify the Hosted Collector you want to use or create a new Hosted Collector. For instructions, see Configure a Hosted Collector.

To configure Gmail Trace Logs Source:

  1. In Sumo Logic, select Manage Data > Collection > Collection.
  2. On the Collection page, click Add Source next to a Hosted Collector.
  3. Search for and select Gmail Trace Logs.
  4. Enter a Name for the Source. The description is optional.
  5. (Optional) For Source Category, enter any string to tag the output collected from the Source. Category metadata is stored in a searchable field called _sourceCategory.
  6. (Optional) Fields. Click the +Add button to define the fields you want to associate. Each field needs a name (key) and value.
    • A green circle with a check mark is shown when the field exists in the Fields table schema.
    • An orange triangle with an exclamation point is shown when the field doesn't exist in the Fields table schema. In this case, an option to automatically add the nonexistent fields to the Fields table schema is provided. If a field is sent to Sumo Logic that does not exist in the Fields schema, it is ignored, known as dropped.
  7. Project ID. Enter the globally unique identifier for your project. You can find it in the Google Cloud Console.
  8. Dataset ID. Enter the ID of the dataset. The Dataset ID is unique within your project.
  9. Data Location. Enter the location of the dataset, which was set when the dataset was created in BigQuery.
  10. Private Key. Enter the private key from the service account JSON file. It is required for authentication.
  11. Client Email. Enter the client email from the service account JSON file. You can also find it in the Google Cloud Console.
  12. Token URI. Enter the token URI used for generating the token, from the service account JSON file.
  13. Collection should begin is set to 24 Hours ago by default. You can adjust it based on your needs.
    note

    If you set Collection should begin to a collection time that overlaps with data that was previously ingested on a source, it may result in duplicated data being ingested into Sumo Logic.

  14. (Optional) Processing Rules for Logs. Configure any desired filters, such as allowlist, denylist, hash, or mask, as described in Create a Processing Rule.
  15. When you are finished configuring the Source, click Save.

Metadata fields

If the integration is configured with the SIEM forward option, set the Metadata field _siemparser to /Parsers/System/Google/GCP BigQuery Gmail.
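For example, a fields map that enables SIEM forwarding and sets the parser path above might look like this (a sketch; _siemForward and _siemparser are the field names used elsewhere on this page):

```python
# Sketch of a fields map for SIEM forwarding, using the parser path above.
fields = {
    "_siemForward": True,
    "_siemparser": "/Parsers/System/Google/GCP BigQuery Gmail",
}
print(fields["_siemparser"])
```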

JSON schema

Sources can be configured using UTF-8 encoded JSON files with the Collector Management API. See how to use JSON to configure Sources for details. 

Parameter | Type | Value | Required | Description
schemaRef | JSON Object | {"type":"Gmail Trace Logs"} | Yes | Define the specific schema type.
sourceType | String | "Universal" | Yes | Type of source.
config | JSON Object | Configuration object | Yes | Source type specific values.

Configuration Object

Parameter | Type | Required | Default | Description | Example
name | String | Yes | null | Type a desired name of the Source. The name must be unique per Collector. This value is assigned to the metadata field _source. | "mySource"
description | String | No | null | Type a description of the Source. | "Testing source"
category | String | No | null | Type a category of the source. This value is assigned to the metadata field _sourceCategory. See best practices for details. | "mySource/test"
fields | JSON Object | No | null | JSON map of key-value fields (metadata) to apply to the Collector or Source. Use the boolean field _siemForward to enable forwarding to SIEM. | {"_siemForward": false, "fieldA": "valueA"}
projectId | String | Yes | null | The globally unique identifier for your project. | "pelagic-quanta-364805"
datasetId | String | Yes | null | The unique identifier for your dataset within the project. | "gmaillogsbigquery"
privateKey | String | Yes | null | The private key from the service account JSON file. It is required for authentication. |
clientEmail | String | No | null | The client email from the service account JSON file. |
tokenUri | String | Yes | null | The token URI from the service account JSON file, used for generating the token. |
startTime | String | Yes | null | Sets how far back the Source collects data when it first starts. The default is 24 Hours ago. | "24 Hours ago"
dataLocation | String | Yes | null | The location of the dataset, set when the dataset was created in BigQuery. | "US"
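Per the table above, a quick sanity check that a config object carries every required parameter might look like this (a hypothetical helper, not part of the product):

```python
# Required parameters from the configuration table above.
REQUIRED = {"name", "projectId", "datasetId", "privateKey",
            "tokenUri", "startTime", "dataLocation"}

def missing_required(config: dict) -> set:
    """Return the required configuration keys absent from config."""
    return REQUIRED - config.keys()

# Example config with one required key (dataLocation) left out.
config = {
    "name": "Gmail Trace Log",
    "projectId": "Product123",
    "datasetId": "Product123",
    "privateKey": "*****************",
    "tokenUri": "https://oauth2.googleapis.com/token",
    "startTime": "24 Hours ago",
}
print(missing_required(config))  # → {'dataLocation'}
```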

JSON example

{
  "api.version": "v1",
  "source": {
    "config": {
      "name": "Gmail Trace Log",
      "category": "gmail",
      "projectId": "Product123",
      "datasetId": "Product123",
      "privateKey": "*****************",
      "tokenUri": "dshjfgbkjlafdhbdhfvhjksdg",
      "clientEmail": "product123@gmail.com",
      "dataLocation": "US",
      "startTime": "24 Hours ago"
    },
    "schemaRef": {
      "type": "Gmail Trace Logs"
    },
    "sourceType": "Universal"
  }
}
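The JSON above wraps the config and schemaRef objects in the envelope expected by the Collector Management API. As a sketch, assembling that envelope from its parts (the key names come from the example above; the helper itself is hypothetical):

```python
import json

def build_source_payload(config: dict) -> dict:
    """Wrap a Gmail Trace Logs config in the envelope shown above."""
    return {
        "api.version": "v1",
        "source": {
            "config": config,
            "schemaRef": {"type": "Gmail Trace Logs"},
            "sourceType": "Universal",
        },
    }

payload = build_source_payload({"name": "Gmail Trace Log", "category": "gmail"})
print(json.dumps(payload, indent=2))
```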


Terraform example

resource "sumologic_cloud_to_cloud_source" "gmail_trace_logs_source" {
  collector_id = sumologic_collector.collector.id
  schema_ref = {
    type = "Gmail Trace Logs"
  }
  config = jsonencode({
    "name": "Gmail Trace Log",
    "category": "gmail",
    "projectId": "Product123",
    "datasetId": "Product123",
    "privateKey": "*****************",
    "tokenUri": "dshjfgbkjlafdhbdhfvhjksdg",
    "clientEmail": "product123@gmail.com",
    "dataLocation": "US",
    "startTime": "24 Hours ago"
  })
}

resource "sumologic_collector" "collector" {
  name        = "my-collector"
  description = "Just testing this"
}


FAQ

info

Click here for more information about Cloud-to-Cloud sources.


Copyright © 2024 by Sumo Logic, Inc.