
Collect Microsoft SQL Server Logs and Metrics for Non-Kubernetes environments

This page provides instructions for collecting Microsoft SQL Server logs and metrics in non-Kubernetes environments.

Sumo Logic uses the Telegraf operator for SQL Server metric collection and the Installed Collector for collecting SQL Server logs. The diagram below illustrates the components of the SQL Server collection in a non-Kubernetes environment. Telegraf uses the SQL Server input plugin to obtain SQL Server metrics and the Sumo Logic output plugin to send the metrics to Sumo Logic. Logs from SQL Server are collected by a Local File source.

Setting up collection of SQL Server data involves the following steps:

  1. Configure Logs Collection
    1. Configure logging in SQL Server
    2. Configure Sumo Logic Installed Collector
    3. Configure a local file source
    4. Save
  2. Configure Metrics Collection
    1. Configure a Hosted Collector
    2. Configure an HTTP Logs and Metrics Source
    3. Install Telegraf
    4. Configure and start Telegraf

Step 1 - Configure Logs Collection

This section provides instructions for configuring log collection for SQL Server running on a non-Kubernetes environment for the Sumo Logic App for SQL Server. 

  1. Make sure logging is turned on in SQL Server. Follow this documentation to enable it.

    The Microsoft SQL Server App’s queries and dashboards depend on logs from the SQL Server ERRORLOG, which is typically found at:
    C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\ERRORLOG
    The ERRORLOG is typically in UTF-16LE encoding, but verify the file encoding used in your SQL Server configuration. On Windows, this can be found by using a tool such as Notepad++.
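    If you are not sure where the ERRORLOG lives on a particular instance, you can ask SQL Server directly. The query below is a minimal sketch; on recent SQL Server versions, the SERVERPROPERTY('ErrorLogFileName') property returns the path of the current error log.

    -- Returns the full path of the current SQL Server error log (recent versions)
    SELECT SERVERPROPERTY('ErrorLogFileName') AS ErrorLogPath;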

  2. Configure an Installed Collector. If you have not already done so, install and configure an installed collector for Windows by following the documentation.

  3. Configure a local file source to ingest the ERRORLOG.

  1. On the Collection Management screen, click Add, next to the collector, then select Add Source.

  2. Select Local File as the source type.

  3. Configure the Local File Source fields as follows:

  • Name (Required). Enter a name for the source.
  • Description (Optional).
  • File Path (Required). Enter the path to your ERRORLOG and be sure to account for log rotations by using a wildcard: C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\ERRORLOG*
  • Collection should begin. Set this to how far back in time you want to start collecting historical data.
  • Source Host (Optional). Sumo Logic uses the hostname assigned by the OS unless you enter a different host name.
  • Source Category (Recommended). Be sure to follow the Best Practices for Source Categories. A recommended Source Category may be Prod/DB/MSSQL/ERRORLOG.
  • Fields. Set the following fields:
    • component = database
    • db_system = sqlserver
    • db_cluster = <Your_SQLserver_Cluster_Name>. Enter Default if you do not have one.
    • environment = <Your_Environment_Name> (for example, Dev, QA, or Prod)
  4. In the Advanced section, select the following options:

  • Enable Timestamp Parsing. Select Extract timestamp information from log file entries.
  • Time Zone. Choose the option, Ignore time zone from the log file and instead use, and then select your SQL Server host’s time zone.
  • Timestamp Format. The timestamp format is automatically detected.
  • Encoding. Select the encoding of the ERRORLOG you found in step 1, which will typically be UTF-16LE.
  • Enable Multiline Processing. Detect messages spanning multiple lines
    • Infer Boundaries - Detect message boundaries automatically
  5. Click Save.
    At this point, the Installed Collector will start scanning the ERRORLOG and send it to Sumo Logic.
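    To confirm end-to-end collection, you can write a harmless informational message to the ERRORLOG and then search for it under the Source Category you configured. This is a minimal sketch; RAISERROR ... WITH LOG writes to the error log but requires the ALTER TRACE permission (or sysadmin membership).

    -- Writes an informational test entry to the SQL Server ERRORLOG
    -- (requires ALTER TRACE permission or sysadmin membership)
    RAISERROR (N'Sumo Logic ERRORLOG collection test', 10, 1) WITH LOG;

    The entry should then appear in Sumo Logic under the Source Category you set above (for example, Prod/DB/MSSQL/ERRORLOG).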

Step 2 - Configure Metrics Collection

Set up a Sumo Logic HTTP Source
  1. Configure a Hosted Collector for Metrics.
    To create a new Sumo Logic hosted collector, perform the steps in the Configure a Hosted Collector documentation.
  2. Configure an HTTP Logs & Metrics source:
    1. On the Collection Management screen, next to the Hosted Collector you just created, select Add Source.
    2. Select HTTP Logs & Metrics.
      1. Name. (Required). Enter a name for the source.
      2. Description. (Optional).
      3. Source Category (Recommended). Be sure to follow the Best Practices for Source Categories. A recommended Source Category may be Prod/DB/MSSQL/Metrics.
    3. Select Save.
    4. Take note of the URL provided once you click Save. You can retrieve it again by selecting Show URL next to the source on the Collection Management screen.
Set up Telegraf
  1. Install Telegraf if you haven’t already. Use the following steps to install Telegraf.
  2. Configure and start Telegraf.
    As part of collecting metrics data with Telegraf, we will use the SQL Server input plugin to get data from SQL Server and the Sumo Logic output plugin to send data to Sumo Logic.

    Before you configure Telegraf, you need to create a login on every SQL Server instance you want to monitor by running the following script. This script creates a login in SQL Server whose credentials are used in the Telegraf configuration.
USE master;
GO
-- Create a dedicated monitoring login for Telegraf
CREATE LOGIN [<Username_TO_BE_CHANGED>] WITH PASSWORD=N'<Password_TO_BE_CHANGED>';
GO
-- Grant the permissions Telegraf needs to read server state and object definitions
GRANT VIEW SERVER STATE TO [<Username_TO_BE_CHANGED>];
GO
GRANT VIEW ANY DEFINITION TO [<Username_TO_BE_CHANGED>];
GO
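
As an optional sanity check, you can impersonate the new login and list its server-level permissions; the output should include VIEW SERVER STATE and VIEW ANY DEFINITION. This is a minimal sketch using the built-in fn_my_permissions function; substitute the login name you created above.

-- Optional: confirm the monitoring login's server-level permissions
EXECUTE AS LOGIN = '<Username_TO_BE_CHANGED>';
SELECT permission_name FROM fn_my_permissions(NULL, 'SERVER');
REVERT;
GO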

Create or modify telegraf.conf and copy and paste the text below:  

[[inputs.sqlserver]]
  servers = ["Server=<IP_TO_BE_CHANGED>;Port=<Port_TO_BE_CHANGED>;User Id=<Username_TO_BE_CHANGED>;Password=<Password_TO_BE_CHANGED>;app name=telegraf;log=1;"]
  database_type = "SQLServer"
  exclude_query = ['SQLServerSchedulers', 'SQLServerRequests']
  tagexclude = ['physical_filename', 'volume_mount_point']
  [inputs.sqlserver.tags]
    environment = "prod"
    component = "database"
    db_cluster = "<sqlserver_TO_BE_CHANGED>"
    db_system = "sqlserver"

[[outputs.sumologic]]
  url = "<URL_from_HTTP_Logs_and_Metrics_Source>"
  data_format = "prometheus"
  [outputs.sumologic.tagpass]
    db_cluster = ["<sqlserver_TO_BE_CHANGED>"]

Set the fields annotated with <VALUE_TO_BE_CHANGED> to the appropriate values. Do not include the angle brackets (<>) in your final configuration.

  • In the input plugins section, which is [[inputs.sqlserver]]:
    • servers - The connection string for the SQL Server instance. For more information on additional parameters for configuring the SQL Server input plugin for Telegraf, see this doc.
  • In the tags section, which is [inputs.sqlserver.tags]:
    • environment - This is the deployment environment where the SQL Server cluster identified by the value of servers resides. For example: dev, prod, or QA. While this value is optional, we highly recommend setting it.
    • db_cluster - Enter a name to identify this SQL Server cluster. This cluster name will be shown in our dashboards.
  • In the output plugins section, which is [[outputs.sumologic]]:
    • url - This is the HTTP source URL created previously. See this doc for more information on additional parameters for configuring the Sumo Logic Telegraf output plugin.

Here’s an explanation for additional values set by this Telegraf configuration.

  • data_format - "prometheus" - In the output plugins section, [[outputs.sumologic]]. Metrics are sent in the Prometheus format to Sumo Logic.
  • component - "database" - In the input plugins section, [[inputs.sqlserver]]. This value is used by Sumo Logic apps to identify application components.
  • db_system - "sqlserver" - In the input plugins section. This value identifies the database system.

See this doc for all other parameters that can be configured in the Telegraf agent globally.

At this point, Telegraf should start collecting the SQL Server metrics and forward them to the Sumo Logic HTTP Source.
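
If metrics do not appear in Sumo Logic, a common cause is the monitoring login lacking VIEW SERVER STATE. The sketch below, run against the monitored instance, impersonates that login and reads one of the DMVs the sqlserver input plugin typically queries; if it fails with a permissions error, re-run the GRANT statements from earlier in this step.

-- Confirm the monitoring login can read the performance counter DMV used for metrics
EXECUTE AS LOGIN = '<Username_TO_BE_CHANGED>';
SELECT TOP (5) counter_name, cntr_value FROM sys.dm_os_performance_counters;
REVERT;

Running Telegraf once with its --test flag (which gathers metrics and prints them to stdout without sending them) is another quick way to separate collection problems from delivery problems.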