Collect Logs and Metrics for the Amazon DynamoDB App

Steps to collect logs and metrics from Amazon DynamoDB and ingest them into Sumo Logic.

Collect Metrics for Amazon DynamoDB

  • Metadata: Add an account field to the source and assign it a friendly name/alias for the AWS account from which you are collecting metrics. This name appears in the Sumo Logic Explorer View, and metrics can be queried using the account field.


Collect Amazon DynamoDB CloudTrail Logs

  1. To your Hosted Collector, add an AWS CloudTrail Source.
    1. Name. Enter a name to display for the new Source.
    2. Description. Enter an optional description.
    3. S3 Region. Select the Amazon Region for your Amazon DynamoDB S3 bucket.
    4. Bucket Name. Enter the exact name of your Amazon DynamoDB S3 bucket.
    5. Path Expression. Enter the string that matches the S3 objects you'd like to collect. You can use a wildcard (*) in this string. (DO NOT use a leading forward slash. See Amazon Path Expressions.)
    6. Source Category. Enter aws/observability/cloudtrail/logs
    7. Fields. Add an account field and assign it a friendly name/alias for the AWS account from which you are collecting logs. This name appears in the Sumo Logic Explorer View, and logs can be queried using the account field.
    8. Access Key ID and Secret Access Key. Enter your Amazon Access Key ID and Secret Access Key. Learn how to use role-based access to AWS here.
    9. Log File Discovery > Scan Interval. Use the default of 5 minutes, or enter the frequency at which Sumo Logic will scan your S3 bucket for new data. Learn how to configure Log File Discovery here.
    10. Enable Timestamp Parsing. Select the check box.
    11. Time Zone. Select Ignore time zone from the log file and instead use, and select UTC.
    12. Timestamp Format. Select Automatically detect the format.
    13. Enable Multiline Processing. Select the check box, and select Infer Boundaries.
  2. Click Save.
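To illustrate how the Path Expression in step 5 selects objects, here is a small sketch using Python's fnmatch-style wildcard matching. The bucket keys and the path expression below are made-up examples, and Sumo Logic's own matching may differ in edge cases; this only shows the general idea that * matches any characters and there is no leading forward slash.

```python
# Illustrative only: wildcard matching of S3 object keys against a
# path expression. Keys and the expression are hypothetical examples.
from fnmatch import fnmatch

# Note: no leading forward slash, per the instructions above.
path_expression = "AWSLogs/*/CloudTrail/*"

keys = [
    "AWSLogs/123456789012/CloudTrail/us-east-1/2024/01/01/log1.json.gz",
    "AWSLogs/123456789012/Config/us-east-1/2024/01/01/config.json.gz",
]

# Only keys containing the CloudTrail segment match the expression.
matched = [k for k in keys if fnmatch(k, path_expression)]
print(matched)
```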

Field in Field Schema

Log in to Sumo Logic and go to Manage Data > Logs > Fields. Search for the “tablename” field and create it if it is not present. Learn how to create and manage fields here.

Field Extraction Rule(s)

Create a Field Extraction Rule for CloudTrail logs. Learn how to create a Field Extraction Rule here.

Rule Name: AwsObservabilityDynamoDBCloudTrailLogsFER
Applied at: Ingest Time
Scope (Specific Data): 
account=* eventname eventsource "dynamodb.amazonaws.com"
Parse Expression:
| json "eventSource", "awsRegion", "requestParameters.tableName", "recipientAccountId" as eventSource, region, tablename, accountid nodrop
| where eventSource = "dynamodb.amazonaws.com"
| "aws/dynamodb" as namespace
| tolowercase(tablename) as tablename
| fields region, namespace, tablename, accountid
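To see what the parse expression above extracts, here is a Python sketch (not Sumo Logic's engine) applying the same logic to a single CloudTrail record. The record below is a trimmed, illustrative example; field names match those used in the rule.

```python
# Sketch of the Field Extraction Rule's logic on one sample CloudTrail
# record. The record content is illustrative, not real account data.
import json

record = json.loads("""
{
  "eventSource": "dynamodb.amazonaws.com",
  "awsRegion": "us-east-1",
  "requestParameters": {"tableName": "MyOrders"},
  "recipientAccountId": "123456789012"
}
""")

# Mirrors: | where eventSource = "dynamodb.amazonaws.com"
if record["eventSource"] == "dynamodb.amazonaws.com":
    fields = {
        "region": record["awsRegion"],
        "namespace": "aws/dynamodb",            # constant, as in the rule
        # Mirrors: | tolowercase(tablename) as tablename
        "tablename": record["requestParameters"]["tableName"].lower(),
        "accountid": record["recipientAccountId"],
    }
    print(fields)
```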

Centralized AWS CloudTrail Log Collection

If you have centralized CloudTrail log collection and ingest logs from all accounts into a single Sumo Logic CloudTrail log source, create the following Field Extraction Rule to map each AWS account ID to its friendly name/alias. Create the rule if it is not already present, or update it as required.

Rule Name: AWS Accounts
Applied at: Ingest Time
Scope (Specific Data): 
_sourceCategory=aws/observability/cloudtrail/logs

Parse Expression:

Enter a parse expression to create an “account” field that maps to the alias you set for each sub-account. For example, if you used the “dev” alias for an AWS account with ID "528560886094" and the “prod” alias for an AWS account with ID "567680881046", your parse expression would look like this:

| json "recipientAccountId"
// Manually map your aws account id with the AWS account alias you setup earlier for individual child account
| "" as account
| if (recipientAccountId = "528560886094",  "dev", account) as account
| if (recipientAccountId = "567680881046",  "prod", account) as account
| fields account
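The if-chain above is equivalent to a lookup table keyed by account ID. The following Python sketch expresses the same mapping; the account IDs and aliases are the examples from the text, and the dictionary approach is simply one way to picture the logic.

```python
# Sketch of the account-alias mapping in the parse expression above.
# IDs/aliases are the examples from the text ("dev" and "prod").
ACCOUNT_ALIASES = {
    "528560886094": "dev",
    "567680881046": "prod",
}

def account_alias(recipient_account_id: str) -> str:
    # Falls back to "" like the | "" as account default in the rule.
    return ACCOUNT_ALIASES.get(recipient_account_id, "")

print(account_alias("528560886094"))  # dev
```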