
Integrate Other Tools with the Software Development Optimization Solution (Optional)

Learn how to integrate other tools with the Software Development Optimization Solution.

If your DevOps pipeline includes tools that are not currently supported by the Software Development Optimization (SDO) Solution, you can still integrate them with Terraform and map their log events to the predefined schema model.

Extending the current schema beyond the out-of-the-box toolset and FERs

If your DevOps pipeline has tools that are not supported by the out-of-the-box toolset of the Software Development Optimization Solution, you can integrate your tool and map it to the relevant schema model.

For example, if you were to integrate Azure DevOps, which provides developer services that help teams plan work, collaborate on code development, and build and deploy applications, with the SDO solution, you would first:

  1. Install the SDO solution as documented here.

  2. Add SDO field extraction rules to map events from your tool to the SDO event schema.

For example, if you were to map the Azure DevOps pull request event to the Software Development Optimization pull request event schema, you would create a new FER and extract and map fields to the pull request schema. You can use the parse expressions defined to support out-of-the-box tools in this JSON file.

json field=_raw "eventType"
| where eventType matches "git.pullrequest*"
| json "eventType", "resource.title", "createdDate", "resource.closedDate", "", "resource.status" , "resource.url", "resource.lastMergeSourceCommit.commitId", "resource.targetRefName", "resource.createdBy.displayName", "", "resource.reviewers[0].displayName" as action, title, dateTime, closeddate ,repository_name,  merge, link, commit_id, target_branch ,user, service, reviewers nodrop
| parseDate(dateTime, "yyyy-MM-dd'T'HH:mm:ss") as dateTime_epoch
| if(action matches "*merged" and merge matches "completed", "merged", if(action matches "*merged" and merge matches "active", "declined", if (action matches "*created", "created", "other"  ))) as status
| if (status="merged", parseDate(closeddate, "yyyy-MM-dd'T'HH:mm:ss") , 000000000 ) as closeddate_epoch
| toLong(closeddate_epoch)
| "pull_request" as event_type

Integrating with Terraform

How does Terraform work?

Terraform is a tool for building, changing, and versioning infrastructure safely and efficiently. Terraform uses plugins to support specific implementations, such as Sumo Logic and AWS, or provisioners such as Bash.

Terraform uses remote procedure calls (RPC) to communicate with Terraform plugins, which manage resources for a specific service such as Sumo Logic, AWS, or Bash.

The Software Development Optimization solution's Terraform script uses the Sumo Logic plugin to create resources in Sumo Logic, such as sources, FERs, and fields. It also uses other plugins, such as the Jira Terraform provider, to create resources in external systems.
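As a sketch of what the Sumo Logic plugin configuration looks like, the provider block below uses the `sumo_access_id` and `sumo_access_key` variables that appear later in this document; the `environment` value is illustrative and depends on your deployment:

```hcl
# Sketch: configure the Sumo Logic provider.
# The environment value ("us2") is illustrative.
provider "sumologic" {
  access_id   = var.sumo_access_id
  access_key  = var.sumo_access_key
  environment = "us2"
}
```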

SDO Terraform Script Structure

The SDO solution script is organized into the following groups of files:

  • Configuration Files.
  • Sumo Logic Resource Creation Files.
  • Other Systems Resource Creation Files.
  • System Files.
  • Test Files:
    • integration_test.go
    • fileutil.go

Before modifying the Terraform script, determine whether your tool can be added:

  1. Determine if there is an already existing Terraform provider (core or third-party) for your tool.
  2. If no Terraform provider is available, determine which REST APIs can be used to perform the actions you require, for example, webhook creation.
  3. If no provider and no REST APIs are available, your tool cannot be added to the Terraform script.
  4. Once you've confirmed that you can add the tool, follow this section to initialize your Terraform setup.
  5. Create the Build and Deploy schema and corresponding FERs as explained here.

Steps to Modify Terraform to Add Your Own Tool

Follow the steps below to add your tool to the SDO Terraform script. In the examples below, we assume that we are integrating Azure DevOps with the SDO solution.

  1. If there is an already existing Terraform provider for your tool, follow the steps below:
    • If the provider is available as an officially supported Terraform provider, add the provider name and version to the required_providers block. For example, if we were to add Azure DevOps:

      terraform {

       required_providers {

         bitbucket = "~> 1.2"

         null      = "~> 2.1"

         #   restapi = "~> 1.12"

         template = "~> 2.1"

         #   jira = "~> 0.1.11"

         github    = "~> 2.8"

         pagerduty = "~> 1.7"

         sumologic = "~> 2.1.0"

         azuredevops = "~> 0.0.1"

       }
      }


    • If the provider is not an officially supported Terraform provider, follow the installation steps explained on the provider's website.
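Note that on Terraform 0.13 and later, third-party providers are declared with an explicit source address inside required_providers. The following sketch assumes the Azure DevOps provider is published under the microsoft/azuredevops registry namespace:

```hcl
terraform {
  required_providers {
    azuredevops = {
      source  = "microsoft/azuredevops"
      version = "~> 0.0.1"
    }
  }
}
```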
  2. Clone the Software Development Optimization repository.

    $ git clone

  3. Navigate to the directory sumologic-solution-templates/software-development-optimization-terraform.

  4. Modify the variables file: add a flag for your tool, such as install_azuredevops, and another flag for the source category, such as azuredevops_sc.

    Add these two variables, for example:

    # AzureDevOps

    variable "install_azuredevops" {}

    variable "azuredevops_sc" {}

  5. The Terraform script also creates sources and installs the apps in Sumo Logic. Modify the file containing this code to add source creation and app installation code, for example:

    # Create/Delete AzureDevops Source

    resource "sumologic_http_source" "azuredevops" {

     count        = "${var.install_azuredevops}" == "collection" || "${var.install_azuredevops}" == "all" ? 1 : 0

     name         = "AzureDevops"

     category     = var.azuredevops_sc

     collector_id =



    # Install Apps

    # Install AzureDevops

    resource "null_resource" "install_azuredevops_app" {

     count      = "${var.install_azuredevops}" == "app" || "${var.install_azuredevops}" == "all" ? 1 : 0

     depends_on = [sumologic_http_source.azuredevops]


     provisioner "local-exec" {

       command = <<EOT

           curl -s --request POST '${local.sumo_api_endpoint_fixed}/v1/apps/0197347ca-3b08-457c-bd15-7239f1ab66c9/install' \

               --header 'Accept: application/json' \

               --header 'Content-Type: application/json' \

               -u ${var.sumo_access_id}:${var.sumo_access_key} \

               --data-raw '{ "name": "AzureDevops", "description": "The Sumo Logic App for AzureDevops.", "destinationFolderId": "${}","dataSourceValues": {"jiralogsrc": "_sourceCategory = ${var.azureDevops_sc}" }}'





  • The app UUID (0197347ca-3b08-457c-bd15-7239f1ab66c9) can be determined by navigating to the app's page in the App Catalog; the UUID is part of the URL in the address bar.
  • Data source values (jiralogsrc) can be obtained by calling the Sumo Logic apps GET endpoint /v1/apps/{uuid}; in its response, the field parameterId identifies a data source value.
  6. Create a configuration file for your tool; this file will have the variables required for the Azure DevOps provider, along with any user input you would like to have.

    # Sumo Logic - SDO Terraform

    # Configure AzureDevops credentials and parameters.

    azure_devops_org_service_url  = "<YOUR_AZUREDEVOPS_ORG_URL>"

    azure_devops_token = "<YOUR_AZUREDEVOPS_TOKEN>"
    Also create the corresponding variables, as you did in the earlier steps.

  7. Create a file that will house the actual resource management code, for example:

    # Sumo Logic - SDO Terraform

    # Configure the AzureDevops credentials.

    # Configure the AzureDevops Provider

    provider "azuredevops" {

     personal_access_token = var.azure_devops_token

     org_service_url       = var.azure_devops_org_service_url
    }

    # Manage Resources in AzureDevops

    resource "azuredevops_project" "project" {

     count              = "${var.install_azuredevops}" == "collection" || "${var.install_azuredevops}" == "all" ? 1 : 0

     project_name       = "Test Project"

     description        = "Test Project Description"

     visibility         = "private"

     version_control    = "Git"

     work_item_template = "Agile"

     features = {

         "testplans" = "disabled"

         "artifacts" = "disabled"
     }
    }

  8. The same file should have the code to manage collection in Azure DevOps.
    • However, currently the Azure Provider does not support creating webhooks. 
    • You can use the Rest API Terraform Provider to manage webhooks if appropriate and Terraform compatible REST APIs are available, or this AzureDevops specific procedure. 
    • For an example of using the REST API Terraform provider, take a look at the existing SDO Terraform files.
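As an illustration of the REST API provider approach, the following sketch creates an Azure DevOps service hook subscription pointing at the HTTP source created earlier. The provider configuration, API path, and payload fields are assumptions based on the Azure DevOps Service Hooks API and the community "restapi" provider, not part of the SDO script:

```hcl
# Hypothetical sketch using the community "restapi" provider.
provider "restapi" {
  uri                  = var.azure_devops_org_service_url
  username             = "user"
  password             = var.azure_devops_token
  write_returns_object = true
}

resource "restapi_object" "azuredevops_webhook" {
  path = "/_apis/hooks/subscriptions"
  data = jsonencode({
    publisherId      = "tfs"
    eventType        = "git.pullrequest.created"
    consumerId       = "webHooks"
    consumerActionId = "httpRequest"
    consumerInputs = {
      # URL of the Sumo Logic HTTP source created earlier.
      url = sumologic_http_source.azuredevops[0].url
    }
  })
}
```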
  9. Modify the FER properties, adding two properties for each new FER, such as azuredevops_build_fer_scope and azuredevops_build_fer_parse. Also create the corresponding variables as earlier.
    Make sure that the source category is not hardcoded in the FER.
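For example, the corresponding variable declarations might look like the following. The scope keyword shown is illustrative; the parse expression value would come from the JSON file of parse expressions mentioned earlier:

```hcl
# Illustrative defaults; keep the source category out of the FER scope itself.
variable "azuredevops_build_fer_scope" {
  default = "\"eventType\":\"build.complete\""
}

variable "azuredevops_build_fer_parse" {}
```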

  10. Modify the FER creation code to create FERs in Sumo Logic for AzureDevops, as below:

    resource "sumologic_field_extraction_rule" "azuredevops_build_fer" {

     count            = "${var.install_azuredevops}" == "fer" || "${var.install_azuredevops}" == "collection" || "${var.install_azuredevops}" == "all" ? 1 : 0

     depends_on       = [sumologic_http_source.azuredevops]

     name             = "AzureDevops Build"

     scope            = "_sourceCategory=${var.azuredevops_sc} ${var.azuredevops_build_fer_scope}"

     parse_expression = var.azuredevops_build_fer_parse

     enabled          = true


  11. If Sumo Logic supports outgoing webhooks for your tool, you can modify the corresponding webhook files along the same lines.

  12. Test your code by running the Terraform commands:
    terraform init
    terraform plan
    terraform apply

If you want to verify that your integration works, follow this section.

Sumo Logic SDO Terraform utilizes Terratest for integration testing. Note that the integration test will actually create/delete resources.

The tests read the configuration files and run the Terraform commands, i.e., init and apply. Terraform then stores the state and outputs the variables as defined in the output definitions.

Modify the output definitions so that the required IDs are exported; these will be used for verification by the integration tests. For example, here is an output variable for an FER:

output "azuredevops_build_fer_id" {

 value = sumologic_field_extraction_rule.azuredevops_build_fer.*.id


Add your test to test/integration_test.go utilizing the output variable:

func validateSumoLogicAzureDevOpsBuildFER(t *testing.T, terraformOptions *terraform.Options) {

   // Run `terraform output` to get the value of an output variable

   ferID := terraform.Output(t, terraformOptions, "azuredevops_build_fer_id")

   if ferID != "[]" && getProperty("install_azuredevops") == "true" {

       ferID = strings.Split(ferID, "\"")[1]

       // Verify that we get back a 200 OK

       http_helper.HTTPDoWithCustomValidation(t, "GET", fmt.Sprintf("%s/api/v1/extractionRules/%s", sumologicURL, ferID), nil, headers, customValidation, nil)
   }
}

You can also add your custom validations by making a copy of the customValidation function.
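For instance, your copy of the validation function might check the response body as well as the status code. The following is a self-contained sketch; validateFERResponse is a hypothetical name, and we assume (as in the call above) that the validation function receives the HTTP status code and response body:

```go
package main

import (
	"fmt"
	"strings"
)

// validateFERResponse is a hypothetical custom validation: it accepts the
// response only if the status is 200 and the body mentions the FER name.
func validateFERResponse(statusCode int, body string) bool {
	return statusCode == 200 && strings.Contains(body, "AzureDevops Build")
}

func main() {
	fmt.Println(validateFERResponse(200, `{"name":"AzureDevops Build","enabled":true}`))
	fmt.Println(validateFERResponse(404, `{"status":404}`))
}
```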

Running Integration Tests

The integration tests create real resources, which might incur costs; please review the respective systems' documentation for more details.

  1. Configure the execution options in the configuration files.
  2. Configure your Sumo Logic username by setting the environment variable:
    * `SUMOLOGIC_USERNAME`. It should be the same user with which the access ID was created.
  3. Install Terraform and make sure it's on your PATH.
  4. Install Golang and make sure it's on your PATH.
  5. Execute Tests
    cd test
    dep ensure
    go test -v -timeout 30m

  6. The tests are divided into multiple stages:
    * deploy
    * validateSumoLogic
    * validateAtlassian
    * validatePagerduty
    * cleanup

All the stages are executed by default. If you would like to skip a stage, set an environment variable like `SKIP_deploy=true`.

This is very helpful, for example, if you are modifying the code and do not want to create/destroy resources with each test run.

To achieve this, for the first run you would set `SKIP_cleanup=true` and all other variables should be unset.

For the second run it would be `SKIP_cleanup=true` and `SKIP_deploy=true`.

Now, you can run tests without creating/destroying resources with each run. Once you are finished, unset `SKIP_cleanup` and run the tests to clean up the resources.