# Databricks Audit Source

The Sumo Logic source for Databricks enables you to collect audit logs from Databricks into Sumo Logic. This integration helps you to capture structured records of user and system activities within the Databricks workspace, including SQL queries, job executions, cluster events, and workspace changes. These logs facilitate auditing, security monitoring, and regulatory compliance.
## Data collected

| Polling Interval | Data |
|---|---|
| 5 minutes | Audit Logs |
## Setup

### Vendor configuration
To configure the Databricks Audit source, you must provide the Databricks Base URL (API Gateway URL), Warehouse ID, Client ID, and Client Secret. Follow the steps below to generate the required values.
#### Base URL

Follow the steps below to get the Base URL:
- Sign in to your Databricks workspace.
- After login, the Base URL is visible in your browser's address bar. For example, `https://YOUR-INSTANCE.databricks.com`.
#### Warehouse ID

The Warehouse ID is required to query Databricks SQL and fetch audit logs. Follow the steps below to get the Warehouse ID:
- Sign in to your Databricks workspace.
- Navigate to SQL Warehouses in the sidebar.
- Click the name of the warehouse you want to use.
- On the warehouse details page, select the Properties tab.
- Locate and copy the Warehouse ID. For example, `bd4dc8ef7e54782c`.
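Under the hood, the source uses the Warehouse ID to run SQL queries against the workspace. As a rough sketch, an audit-log query could be submitted through Databricks' SQL Statement Execution API (`POST /api/2.0/sql/statements`) against the `system.access.audit` system table; the function and timestamp parameter below are illustrative, not the source's documented internals:

```python
import json

def audit_query_payload(warehouse_id: str, since_ts: str) -> str:
    """Build the JSON body for Databricks' SQL Statement Execution API.

    warehouse_id: the SQL Warehouse ID copied from the Properties tab.
    since_ts: only fetch audit events newer than this timestamp.
    """
    return json.dumps({
        "warehouse_id": warehouse_id,
        "statement": (
            "SELECT * FROM system.access.audit "
            f"WHERE event_time > '{since_ts}' ORDER BY event_time"
        ),
        "wait_timeout": "30s",  # how long the API call waits for a result
    })

payload = audit_query_payload("bd4dc8ef7e54782c", "2024-01-01T00:00:00Z")
print(json.loads(payload)["warehouse_id"])  # bd4dc8ef7e54782c
```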
#### Client ID and Client Secret

To generate the Client ID and Client Secret, refer to the Create an OAuth Secret section.
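With the Client ID and Client Secret in hand, a bearer token can be obtained through Databricks' OAuth machine-to-machine (client credentials) flow. Below is a minimal sketch of building that token request with the Python standard library; the `/oidc/v1/token` path and `all-apis` scope follow Databricks' documented M2M flow, and all credential values shown are placeholders:

```python
import base64
import urllib.parse
import urllib.request

def build_token_request(base_url, client_id, client_secret):
    """Build (but do not send) the OAuth client-credentials token request."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "scope": "all-apis",  # grants the token access to Databricks REST APIs
    }).encode()
    return urllib.request.Request(
        f"{base_url}/oidc/v1/token",
        data=body,
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        method="POST",
    )

req = build_token_request(
    "https://YOUR-INSTANCE.databricks.com", "my-client-id", "my-secret"
)
print(req.full_url)  # https://YOUR-INSTANCE.databricks.com/oidc/v1/token
```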
### Source configuration
When you create a Databricks Audit Source, you add it to a Hosted Collector. Before creating the Source, identify the Hosted Collector you want to use or create a new Hosted Collector. For instructions, see Configure a Hosted Collector and Source.
To configure the Databricks Audit Source:
- New UI. In the Sumo Logic main menu, select Data Management, and then under Data Collection select Collection. You can also click the Go To... menu at the top of the screen and select Collection.
  Classic UI. In the main Sumo Logic menu, select Manage Data > Collection > Collection.
- On the Collectors page, click Add Source next to a Hosted Collector.
- Search for and select the Databricks Audit icon.
- Enter a Name to display for the Source in Sumo Logic. The description is optional.
- (Optional) For Source Category, enter any string to tag the output collected from the Source. Category metadata is stored in a searchable field called `_sourceCategory`.
- (Optional) Fields. Click the +Add Field link to define the fields you want to associate. Each field needs a name (key) and value.
  - A green circle with a check mark is shown when the field exists and is enabled in the Fields table schema.
  - An orange triangle with an exclamation point is shown when the field doesn't exist in the Fields table schema. In this case, you'll see an option to automatically add or enable the nonexistent fields in the Fields table schema. If a field sent to Sumo Logic does not exist in the Fields schema, it is ignored (dropped).
- Enter the Base URL of your account.
- Enter the Warehouse ID collected from the [vendor configuration](#vendor-configuration) to fetch audit logs.
- Enter the Client ID and Client Secret collected from the [vendor configuration](#vendor-configuration) to authorize access to your Databricks resources.
- The Polling Interval is set to 5 minutes by default. You can adjust it based on your needs.
- When you are finished configuring the Source, click Save.
## JSON schema
Sources can be configured using UTF-8 encoded JSON files with the Collector Management API. See Use JSON to Configure Sources for details.
| Parameter | Type | Value | Required | Description |
|---|---|---|---|---|
| schemaRef | JSON Object | `{"type":"Databricks Audit Logs"}` | Yes | Define the specific schema type. |
| sourceType | String | `"Universal"` | Yes | Type of source. |
| config | JSON Object | Configuration object | Yes | Source type specific values. |
### Configuration Object

| Parameter | Type | Required | Default | Description | Example |
|---|---|---|---|---|---|
| name | String | Yes | null | Type a desired name of the source. The name must be unique per Collector. This value is assigned to the metadata field `_source`. | `"mySource"` |
| description | String | No | null | Type a description of the source. | `"Testing source"` |
| category | String | No | null | Type a category of the source. This value is assigned to the metadata field `_sourceCategory`. See best practices for details. | `"mySource/test"` |
| fields | JSON Object | No | null | JSON map of key-value fields (metadata) to apply to the Collector or Source. Use the boolean field `_siemForward` to enable forwarding to SIEM. | `{"_siemForward": false, "fieldA": "valueA"}` |
| baseURL | String | Yes | null | Base URL of the Databricks workspace. For example, `https://<workspace-name>.databricks.com`. | |
| warehouseID | String | Yes | null | Unique identifier of the SQL Warehouse within the Databricks workspace, used to query and fetch audit logs. For example, `bd4dc8ef7e54782c`. | |
| clientID | String | Yes | null | Client ID of the account. | |
| clientSecret | String | Yes | null | Client Secret of the account. | |
| pollingIntervalMin | Integer | No | 5 | Time interval, in minutes, after which the source checks for new data. Minimum: 5 minutes. Maximum: 24 hours. | |
### JSON example
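A sketch of a source configuration assembled from the parameter tables above; all values are placeholders to replace with your own:

```json
{
  "api.version": "v1",
  "source": {
    "schemaRef": {
      "type": "Databricks Audit Logs"
    },
    "config": {
      "name": "Databricks Audit",
      "description": "Testing source",
      "category": "mySource/test",
      "fields": {
        "_siemForward": false
      },
      "baseURL": "https://YOUR-INSTANCE.databricks.com",
      "warehouseID": "bd4dc8ef7e54782c",
      "clientID": "********",
      "clientSecret": "********",
      "pollingIntervalMin": 5
    },
    "sourceType": "Universal"
  }
}
```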
### Terraform example
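A sketch using the Sumo Logic Terraform provider's `sumologic_cloud_to_cloud_source` resource; the collector reference and all configuration values are placeholders:

```hcl
resource "sumologic_cloud_to_cloud_source" "databricks_audit_source" {
  collector_id = sumologic_collector.collector.id
  schema_ref = {
    type = "Databricks Audit Logs"
  }
  config = jsonencode({
    "name" : "Databricks Audit",
    "description" : "Testing source",
    "category" : "mySource/test",
    "fields" : {
      "_siemForward" : false
    },
    "baseURL" : "https://YOUR-INSTANCE.databricks.com",
    "warehouseID" : "bd4dc8ef7e54782c",
    "clientID" : "********",
    "clientSecret" : "********",
    "pollingIntervalMin" : 5
  })
}

resource "sumologic_collector" "collector" {
  name        = "my-collector"
  description = "Hosted collector for Databricks audit logs"
}
```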
## FAQ
Click here for more information about Cloud-to-Cloud sources.