This page has instructions for configuring a pipeline that ships logs from Azure Blob Storage to an Event Hub, on to an Azure function, and finally to an HTTP source on a hosted collector in Sumo Logic.
- Only General-purpose v2 (GPv2) and Blob storage accounts are supported. This integration does not support General-purpose v1 (GPv1) accounts.
- Configure your storage account in the same location as your Azure Service.
- This solution supports only log files from Blob storage that have file extensions of .csv, .json, .blob, or .log.
- You configure the Azure service to export logs to a container in a storage account created for that purpose.
- You create an Event Grid subscription with the storage container as publisher and the event hub (created by the Sumo-provided ARM template) as subscriber. Event Grid routes block blob creation events to the event hub.
- Event Hub streams the events to the TaskProducer Azure function, which creates tasks (JSON objects that specify the start byte, end byte, container name, and blob path) and pushes them to the Service Bus task queue.
- The TaskConsumer Azure function, which is triggered when the Service Bus receives a new task, reads the block blob from the start byte to the end byte and sends that data to Sumo.
- The setup also includes a failure-handling mechanism. For more information about the solution strategy, see Azure Blob Storage.
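To make the pipeline above concrete, the task message that the TaskProducer function pushes to the Service Bus queue can be sketched as a small JSON payload. The field names below are illustrative, not the exact schema used by the Sumo-provided functions:

```python
import json

# Illustrative sketch of a task message: the TaskProducer describes a
# byte range of a newly created block blob, and the TaskConsumer reads
# exactly that range and forwards it to Sumo. Field names are
# assumptions, not the real schema of the Sumo functions.
task = {
    "containerName": "weblogs",          # container the blob lives in
    "blobName": "2024/05/01/app.log",    # path of the block blob
    "startByte": 0,                      # first byte to read
    "endByte": 1048575,                  # last byte to read (inclusive)
}

message = json.dumps(task)               # serialized onto the task queue
decoded = json.loads(message)            # deserialized by the consumer
print(decoded["endByte"] - decoded["startByte"] + 1)  # bytes covered by this task
```

Splitting a blob into byte-range tasks like this is what lets the consumer functions scale out: several tasks for one large blob can be processed in parallel.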
Step 1. Configure Azure storage account
In this step you configure a storage account to which you will export monitoring data for your Azure service.
- Create a new storage account. (The storage account must be a General-purpose v2 (GPv2) storage account.) For instructions, see Create a storage account in Azure help.
- In the Azure portal, navigate to the storage account you created in the previous step.
- Under Settings, select Access keys and make a note of the Connection String value in the key1 section.
- Select Containers under Blob Service.
- Select + Container.
- Enter a Name for the container.
- Select Private for the Public Access Level.
- Click OK.
Make a note of the container name; you will need to supply it later in this procedure.
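The Connection String you noted in this step follows the standard `key=value;key=value` layout of Azure storage connection strings. A quick sanity check that you copied the whole string (the account name and key in it are needed by the ARM template later) is to parse it; the values below are placeholders, not real credentials:

```python
# Parse an Azure storage connection string (standard key=value;... layout).
# Account name and key here are placeholders, not real credentials.
conn = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=mystorageaccount;"
    "AccountKey=abc123fakekey==;"
    "EndpointSuffix=core.windows.net"
)

# Split on ';' into key=value pairs; split each pair on the FIRST '='
# only, because base64 account keys can end in '=' characters.
parts = dict(p.split("=", 1) for p in conn.split(";") if p)
print(parts["AccountName"])
```

If `AccountName` or `AccountKey` is missing after parsing, the string was truncated when copied.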
Step 2. Configure an HTTP source
In this step, you configure an HTTP source to receive logs from the Azure function.
- Select a hosted collector where you want to configure the HTTP source. If desired, create a new hosted collector, as described on Configure a Hosted Collector.
- Configure an HTTP source, as described on HTTP Logs and Metrics Source. Make a note of the URL for the source; you will need it in the next step.
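Logs reach this source as plain HTTP POSTs of the log payload. A minimal sketch of the request the Azure function ultimately issues (the URL below is a placeholder for your real source URL, and the request is only built here, not sent):

```python
import urllib.request

# Placeholder for the unique URL Sumo Logic generates for your HTTP source.
SUMO_HTTP_SOURCE_URL = "https://collectors.sumologic.com/receiver/v1/http/XXXX"

log_lines = "2024-05-01 12:00:00 GET /index.html 200\n"

# Build (but do not send) the POST request a forwarder would issue.
req = urllib.request.Request(
    SUMO_HTTP_SOURCE_URL,
    data=log_lines.encode("utf-8"),
    headers={"Content-Type": "text/plain"},
    method="POST",
)
print(req.get_method(), len(req.data))
```

Because the source accepts any POSTed text, this is also a handy way to verify the source URL works before wiring up the Azure side.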
Step 3. Configure Azure resources using ARM template
In this step, you use a Sumo-provided Azure Resource Manager (ARM) template to create an Event Hub, three Azure functions, a Service Bus queue, and a storage account.
- Download the blobreaderdeploy.json ARM template.
- Go to Template deployment in the Azure Portal.
- Click Create.
- On the Custom deployment blade, click Build your own template in the editor.
- Copy the contents of the template and paste it into the editor window.
- Click Save.
- Now you are back on the Custom deployment blade.
- Create a new Resource Group (recommended) or select an existing one.
- Choose Location.
- Set the value of the SumoEndpointURL parameter to the URL for the HTTP source you configured in Step 2.
- Set the value of the StorageAccountConnectionString parameter to the connection string you noted in Step 1.
- Agree to the terms and conditions.
- Click Purchase.
- Verify the deployment was successful by looking at Notifications at top right corner of Azure Portal.
- (Optional) In the same window, you can click Go to resource group to verify that all resources were successfully created.
- Go to Storage accounts and search for “sumobrlogs”. Click on “sumobrlogs<random-string>”.
- Under Table Service:
- Click Tables.
- Click + Table.
- For Name, enter “FileOffsetMap”.
- Click OK.
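The FileOffsetMap table lets the functions remember how far into each blob they have already read, so that appended data is shipped only once. A hedged sketch of the idea (the real functions use Azure Table storage keyed per blob; a plain dict stands in here, and the names are illustrative, not the actual schema):

```python
# Illustrative sketch of the offset tracking done via the FileOffsetMap
# table: for each blob, remember the last byte already shipped, and
# produce a task only for newly appended bytes. A dict stands in for
# the Azure Table; field names are assumptions, not the real schema.
offsets = {}  # blob path -> next unread byte

def next_task(blob_path, blob_size):
    start = offsets.get(blob_path, 0)
    if start >= blob_size:
        return None  # blob has not grown: nothing new to ship
    offsets[blob_path] = blob_size
    return {"blob": blob_path, "startByte": start, "endByte": blob_size - 1}

print(next_task("logs/app.log", 100))   # first event: ship the whole blob
print(next_task("logs/app.log", 100))   # duplicate event: nothing new
print(next_task("logs/app.log", 250))   # later event: appended bytes only
```

This is why the table must exist before the functions run: without the stored offsets, every Blob Created event for a growing append blob would re-send the whole file.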
Step 4. Create an Event Grid Subscription
- In the left pane of the Azure portal, click All Services. Search for “Event Grid Subscriptions” and click it.
- Click +Event Subscription. The Create Event Subscription pane appears.
- In the Create Event Subscription pane:
- Topic Type. Select Storage Accounts.
- Subscription. Select the Subscription.
- Resource Group. Select the Resource Group for the Storage Account to which your Azure service will export logs, which you created in Step 1.
- Resource. Select the Storage Account you configured in Step 1.
- In the Event Types section:
- Uncheck the Subscribe to all event types box.
- Select Blob Created from the Define Event Types dropdown.
- Endpoint Type. Select Event Hubs from the dropdown.
- Endpoint. Click on Select an endpoint.
- The Select Event Hub popup appears:
- Resource Group. Select the resource group you created in Step 3.
- Event Hub Namespace. Select the Event Hub namespace created by the ARM template deployment in Step 3.
- Event Hub. Select blobreadereventhub from the dropdown.
- Click Confirm Selection.
- In the Event Subscription Details section:
- Name. Enter the subscription name.
- Event Schema. Select Event Grid Schema from the dropdown.
- In the Filters section, to filter events by container name, enter the following in the Subject Begins With field, replacing <container_name> with the name of the container you created in Step 1:
- Click Create.
- Verify the deployment was successful by looking at Notifications in the top right corner of the Azure Portal.
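For Storage events, Event Grid sets each event's subject to a path of the form `/blobServices/default/containers/<container>/blobs/<blob-path>` (the documented Event Grid subject format for Blob Storage events), which is why a container filter is expressed as a Subject Begins With prefix. A small sketch of building such a prefix and of pulling the container and blob path back out of a subject, as the functions must do:

```python
# Event Grid subjects for Storage blob events follow the documented shape
#   /blobServices/default/containers/<container>/blobs/<blob-path>
# so filtering on a container means matching the prefix up through the
# container segment.
def subject_prefix(container_name):
    return f"/blobServices/default/containers/{container_name}/"

def parse_subject(subject):
    # Split a Blob Created event subject into (container, blob path).
    _, _, rest = subject.partition("/containers/")
    container, _, blob = rest.partition("/blobs/")
    return container, blob

print(subject_prefix("weblogs"))
print(parse_subject("/blobServices/default/containers/weblogs/blobs/2024/app.log"))
```

The trailing slash in the prefix matters: without it, a filter for container `logs` would also match a container named `logs2`.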
Step 5. Push logs from Azure Service to Azure Blob Storage
This section describes how to push logs from an Azure service to Azure Blob Storage by configuring Diagnostic Logs. The instructions use the Azure Web Apps Service as an example.
- Log in to the Azure Portal.
- Click App Services > Your Function App > Diagnostic Logs under Monitoring.
- You will see the Diagnostic Logs blade. Enable Application Logging, Web Server Logging, or both, and click Storage Settings.
- Select the Storage Account whose connection string you configured in Step 1.
- You will see the Containers blade. Select the container you created in Step 1.
- You will see the Diagnostic Logs blade again. Specify the retention days and click Save to exit Diagnostic Logs configuration.