Streamlining Log Management to Amazon S3 Using Atlas Push-based Log Exports With HashiCorp Terraform
As the volumes of data managed by organizations grow exponentially, effective log management becomes critical for maintaining the performance and security of database platforms.
Automating log management with infrastructure-as-code (IaC) tools enables users to push mongod, mongos, and audit logs directly to their S3 buckets, instead of manually downloading zipped log files or building custom solutions to continuously pull logs from Atlas.
To enhance developer agility, we’re introducing a new capability that enables developer teams to push logs to an Amazon S3 bucket, providing a continuous, scalable, cost-effective solution for log storage and analysis.
This guide will walk you through how to set up push-based logging to Amazon S3 in Atlas through our HashiCorp Terraform MongoDB Atlas provider. Let’s get started!
Before you begin, configure your AWS credentials to be used with Terraform.
You can enable the push-based log export feature on a project level in your Atlas organization after you have authorized your Atlas project to access the S3 bucket in your AWS account. This means all clusters in your Atlas project will be automatically configured to push logs to the S3 bucket.
So to start off, let’s give Atlas what it needs to get the appropriate permissions!
Note that if you prefer to manage IAM roles and permissions via the UI or other tools, you can accomplish this by following the appropriate steps outlined in Set Up Unified AWS Access, skipping Step #2, and configuring the appropriate roles in your Terraform configuration.
In this tutorial, we will use Terraform to manage all operations so you don’t have to worry about jumping between your AWS Management Console, your MongoDB Atlas account, and your Terraform configuration files. This requires you to first specify the HashiCorp Terraform MongoDB Atlas provider as well as the HashiCorp Terraform AWS provider versions in a versions.tf file, as below:
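The version constraints below are illustrative, so pin to the versions you have tested (push-based log export support requires a recent release of the Atlas provider):

```terraform
# versions.tf
terraform {
  required_version = ">= 1.0"

  required_providers {
    mongodbatlas = {
      source  = "mongodb/mongodbatlas"
      version = "~> 1.16" # illustrative; pin to the version you have tested
    }
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0" # illustrative
    }
  }
}
```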
Note: Before deploying anything, be sure to store the MongoDB Atlas programmatic API keys you created as part of the prerequisites as environment variables, and additionally, ensure you have configured your credentials for the AWS provider.
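One way to do this is to export them in your shell before running Terraform; the placeholder values below are illustrative:

```bash
# MongoDB Atlas programmatic API keys (read by the mongodbatlas provider)
export MONGODB_ATLAS_PUBLIC_KEY="<your-atlas-public-key>"
export MONGODB_ATLAS_PRIVATE_KEY="<your-atlas-private-key>"

# AWS credentials and region (read by the aws provider)
export AWS_ACCESS_KEY_ID="<your-aws-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-aws-secret-access-key>"
export AWS_REGION="<your-aws-region>"
```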
We will now create a variables.tf file for declaring Terraform variables and a terraform.tfvars file for defining variable values. These files are typically created within the root directory of your Terraform project.
In variables.tf, we will define the identifier of your Atlas organization, the name of your new Atlas project, and the name of the new S3 bucket that the logs will be pushed to:
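A minimal sketch; the variable names are illustrative and only need to match how you reference them in the rest of your configuration:

```terraform
# variables.tf
variable "atlas_org_id" {
  description = "MongoDB Atlas organization ID"
  type        = string
}

variable "atlas_project_name" {
  description = "Name of the new MongoDB Atlas project"
  type        = string
}

variable "s3_bucket_name" {
  description = "Name of the S3 bucket that Atlas will push logs to"
  type        = string
}
```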
In the terraform.tfvars file, we will configure the values for the above variables:
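For example, with placeholder values:

```terraform
# terraform.tfvars
atlas_org_id       = "<ATLAS_ORGANIZATION_ID>"
atlas_project_name = "push-based-log-export-demo"
s3_bucket_name     = "atlas-push-based-log-export-demo"
```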
We can now run `terraform init` in the terminal. This will initialize Terraform and download the Terraform MongoDB Atlas and AWS providers.
To set up unified AWS access, you must have Organization Owner or Project Owner access to the Atlas project.
From here, set up cloud provider access in MongoDB Atlas. You can do this with the `mongodbatlas_cloud_provider_access_setup` resource, which returns an `atlas_aws_account_arn` and an `atlas_assumed_role_external_id`. At this point, your *.tf configuration file should look similar to this:
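A sketch of those two resources, assuming a new project is created as part of this configuration (resource labels are illustrative):

```terraform
resource "mongodbatlas_project" "project" {
  name   = var.atlas_project_name
  org_id = var.atlas_org_id
}

resource "mongodbatlas_cloud_provider_access_setup" "setup" {
  project_id    = mongodbatlas_project.project.id
  provider_name = "AWS"
}
```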
Next, create an IAM role whose `assume_role_policy` is configured with the Atlas AWS account from the previous step as the principal.
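A sketch of that role; the role name and resource label are illustrative, and the trust policy uses the ARN and external ID returned by the setup resource:

```terraform
resource "aws_iam_role" "atlas_role" {
  name = "atlas-log-export-role" # illustrative name

  # Trust policy: only the Atlas AWS account may assume this role,
  # and only with the external ID generated for this Atlas project.
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Action = "sts:AssumeRole"
        Principal = {
          AWS = mongodbatlas_cloud_provider_access_setup.setup.aws_config[0].atlas_aws_account_arn
        }
        Condition = {
          StringEquals = {
            "sts:ExternalId" = mongodbatlas_cloud_provider_access_setup.setup.aws_config[0].atlas_assumed_role_external_id
          }
        }
      }
    ]
  })
}
```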
Finally, use the `mongodbatlas_cloud_provider_access_authorization` resource to authorize and configure the new IAM assumed role ARN.
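For example, building on the resource labels used above:

```terraform
resource "mongodbatlas_cloud_provider_access_authorization" "authorization" {
  project_id = mongodbatlas_project.project.id
  role_id    = mongodbatlas_cloud_provider_access_setup.setup.role_id

  aws {
    iam_assumed_role_arn = aws_iam_role.atlas_role.arn
  }
}
```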
Tip: For now, we have only authorized this role for S3 actions, but once successful, you can expand the policy and use the `role_id` value when configuring other Atlas services that use AWS, such as Data Federation and Encryption at Rest.
Now that you have authorized Atlas to access the Amazon S3 service, we will create a new S3 bucket (or use an existing bucket) and configure an `aws_iam_role_policy` resource to allow the previously created Atlas role to perform the operations required for pushing the logs to this bucket:
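A sketch, assuming a new bucket is created; the policy actions shown are a reasonable set for log export, but check the Atlas documentation for the exact permissions required:

```terraform
resource "aws_s3_bucket" "log_bucket" {
  bucket = var.s3_bucket_name
}

resource "aws_iam_role_policy" "s3_log_export_policy" {
  name = "atlas-log-export-policy" # illustrative name
  role = aws_iam_role.atlas_role.id

  # Allow the Atlas-assumed role to write log objects into the bucket.
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Action = [
          "s3:ListBucket",
          "s3:PutObject",
          "s3:GetObject",
          "s3:GetBucketLocation"
        ]
        Resource = [
          aws_s3_bucket.log_bucket.arn,
          "${aws_s3_bucket.log_bucket.arn}/*"
        ]
      }
    ]
  })
}
```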
Let’s deploy everything so far by running the below commands from the terminal:
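Using the standard Terraform workflow:

```bash
terraform plan
terraform apply
```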
With that, you have configured your S3 bucket!
Coming back to where it all started, you can now easily enable the push-based log export configuration for your Atlas project. All you need is the name of the S3 bucket and the `role_id` from the `mongodbatlas_cloud_provider_access_authorization` resource. Add the below to your *.tf file:
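A sketch, reusing the resource labels from earlier; the `prefix_path` value is optional and illustrative:

```terraform
resource "mongodbatlas_push_based_log_export" "log_export" {
  project_id  = mongodbatlas_project.project.id
  bucket_name = aws_s3_bucket.log_bucket.bucket
  iam_role_id = mongodbatlas_cloud_provider_access_authorization.authorization.role_id
  prefix_path = "push-based-log-test" # illustrative prefix
}
```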
To deploy again, run from the terminal:
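You can run `terraform plan` first to review the change, then:

```bash
terraform apply
```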
If your deployment was successful, you should be greeted with “Apply complete!”
You can verify the configuration by fetching the objects in the S3 bucket. You can do this by updating your Terraform configuration to include the `aws_s3_objects` data source, specifying the bucket name and the prefix path like this:
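A sketch, assuming the bucket and prefix used earlier (the prefix should match the `prefix_path` configured on the log export resource):

```terraform
data "aws_s3_objects" "log_export_objects" {
  bucket = aws_s3_bucket.log_bucket.bucket
  prefix = "push-based-log-test" # should match the prefix_path used above
}
```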
List the objects in the bucket by creating an outputs.tf file and referencing the `keys` attribute, as below:
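For example:

```terraform
# outputs.tf
output "s3_object_keys" {
  description = "Keys of the objects found under the log export prefix"
  value       = data.aws_s3_objects.log_export_objects.keys
}
```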
Run `terraform plan` followed by `terraform apply` in the terminal. You should now see an “atlas-test” object created in your S3 bucket in the outputs. Atlas creates this test object to verify that the configured IAM role has write access to this S3 bucket for the purposes of push-based log export. It is safe to delete this file once the log export configuration is successful.
Now that you have successfully configured push-based log export, as soon as a cluster is deployed in your Atlas project, you should see its logs arrive in your configured S3 bucket. Happy logging!
Congratulations! You now have everything that you need to start pushing your MongoDB Atlas database logs to your Amazon S3 bucket.
The HashiCorp Terraform Atlas provider is open-sourced under the Mozilla Public License v2.0 and we welcome community contributions. To learn more, see our contributing guidelines.
The fastest way to get started is to create a MongoDB Atlas account from the AWS Marketplace. To learn more about the Terraform provider, check out our documentation, tutorials, solution brief, or get started directly.
Go build with MongoDB Atlas and the HashiCorp Terraform Atlas provider today!