
Terraform, can you keep a secret?

Andreas Wittig – 29 May 2020

Did you know that Terraform state can - and most likely does - contain sensitive data? A few examples of sensitive information stored in the Terraform state: database passwords, secrets read from the Parameter Store, and shared keys for VPN connections.
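To make this concrete, here is a minimal, hypothetical example (the resource names and the parameter path are made up): both the decrypted value of the SecureString parameter and the database password end up in plain text in the Terraform state.

# Hypothetical example: both values below are written to the Terraform state in plain text.
data "aws_ssm_parameter" "db_password" {
  name = "/prod/db/password"   # SecureString parameter, decrypted into the state
}

resource "aws_db_instance" "app" {
  identifier          = "app-db"
  engine              = "postgres"
  instance_class      = "db.t3.micro"
  allocated_storage   = 20
  username            = "app"
  password            = data.aws_ssm_parameter.db_password.value   # stored in the state
  skip_final_snapshot = true
}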

When using Terraform to provision cloud infrastructure on AWS, it is common to use the S3 backend, which stores the Terraform state in S3 and uses a DynamoDB table for state locking. When doing so, Terraform writes sensitive information to S3. By default, this confidential data is neither encrypted at rest nor protected from access by other users or roles in the same AWS account.

Terraform remote state with S3 backend

When using the S3 backend to manage your Terraform state, you should not forget to enable encryption-at-rest and to enforce tight access control on the S3 bucket.
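As a reminder, a typical S3 backend configuration looks roughly like the following sketch (bucket name, object key, key ARN, and table name are placeholders). The encrypt and kms_key_id arguments make Terraform request server-side encryption with your KMS key when writing the state, and the DynamoDB table is used for state locking.

terraform {
  backend "s3" {
    bucket         = "tfstate-cloudonaut-demo"                        # placeholder bucket name
    key            = "demo/terraform.tfstate"                         # placeholder object key
    region         = "eu-west-1"
    encrypt        = true                                             # request server-side encryption for the state object
    kms_key_id     = "arn:aws:kms:eu-west-1:111111111111:key/EXAMPLE" # placeholder CMK ARN
    dynamodb_table = "tfstate-cloudonaut-demo-lock"                   # placeholder lock table
  }
}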

Read on to learn how to protect your sensitive information.


Encryption-at-Rest

Enabling S3 Default Encryption will automatically encrypt the Terraform state when stored on S3. It’s only server-side encryption, but still much better than storing your sensitive information unencrypted.

For full control, I recommend using a customer-managed key (CMK) from the Key Management Service (KMS) when configuring the default encryption for your S3 bucket.

The following snippet shows how to enable default encryption with CloudFormation.

StateBucket:
  Type: AWS::S3::Bucket
  Properties:
    BucketEncryption:
      ServerSideEncryptionConfiguration:
      - ServerSideEncryptionByDefault:
          KMSMasterKeyID: {'Fn::ImportValue': !Sub '${ParentKmsKeyStack}-KeyArn'}
          SSEAlgorithm: 'aws:kms'
    BucketName: !Ref TerrformStateIdentifier
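If you prefer to manage the state bucket with Terraform itself (from a separate Terraform configuration), a rough equivalent looks like the following sketch; it assumes AWS provider version 4 or later and a customer-managed key named aws_kms_key.state defined elsewhere.

resource "aws_s3_bucket" "state" {
  bucket = "tfstate-cloudonaut-demo"   # placeholder bucket name
}

resource "aws_s3_bucket_server_side_encryption_configuration" "state" {
  bucket = aws_s3_bucket.state.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm     = "aws:kms"
      kms_master_key_id = aws_kms_key.state.arn   # assumes a customer-managed key defined elsewhere
    }
  }
}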

Access Control

First of all, use a separate S3 bucket to store your Terraform state. I recommend creating an S3 bucket per AWS account and region.

Next, follow the principle of least privilege for read and write requests to the S3 bucket. The most effective way to tightly restrict access to an S3 bucket is a bucket policy.

The following bucket policy ensures that only the IAM role tfadmin keeps full access to administer the S3 bucket. The IAM user tfuser keeps only read and write access to the objects within the bucket. Everyone else is neither allowed to modify the bucket nor to access the data stored within it.

As the bucket policy uses Deny statements with a NotPrincipal element, it is necessary to specify the account (arn:aws:iam::111111111111:root in my example) as well as the assumed-role user when using IAM roles. Check out NotPrincipal with Deny to learn more.

{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Effect": "Deny",
      "NotPrincipal": {
        "AWS": [
          "arn:aws:iam::111111111111:root",
          "arn:aws:iam::111111111111:role/tfadmin",
          "arn:aws:sts::111111111111:assumed-role/tfadmin/session"
        ]
      },
      "NotAction": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::tfstate-cloudonaut-demo",
        "arn:aws:s3:::tfstate-cloudonaut-demo/*"
      ]
    },
    {
      "Effect": "Deny",
      "NotPrincipal": {
        "AWS": [
          "arn:aws:iam::111111111111:root",
          "arn:aws:iam::111111111111:user/tfuser",
          "arn:aws:iam::111111111111:role/tfadmin",
          "arn:aws:sts::111111111111:assumed-role/tfadmin/session"
        ]
      },
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::tfstate-cloudonaut-demo",
        "arn:aws:s3:::tfstate-cloudonaut-demo/*"
      ]
    }
  ]
}

Summary

Terraform will not keep your secrets! Sensitive information like database passwords, secrets stored within the Parameter Store, or shared keys for a VPN connection is at risk. Therefore, you should enable encryption-at-rest and use a bucket policy to tightly control who can access your Terraform state when using Terraform’s S3 backend.

Are you looking for a ready-to-use implementation? I’ve added a CloudFormation template to our open-source project widdix/aws-cf-templates that you can use to create an S3 bucket and a DynamoDB table optimized for use as a Terraform state backend. Check out the documentation to get started.

Andreas Wittig


I’m an independent consultant, technical writer, and programming founder. All these activities have to do with AWS. I’m writing this blog and all other projects together with my brother Michael.

In 2009, we joined the same company as software developers. Three years later, we were looking for a way to deploy our software—an online banking platform—in an agile way. We got excited about the possibilities in the cloud and the DevOps movement. It’s no wonder we ended up migrating the whole infrastructure of Tullius Walden Bank to AWS. This was a first in the finance industry, at least in Germany! Since 2015, we have accelerated the cloud journeys of startups, mid-sized companies, and enterprises. We have penned books like Amazon Web Services in Action and Rapid Docker on AWS, we regularly update our blog, and we are contributing to the Open Source community. Besides running a 2-headed consultancy, we are entrepreneurs building Software-as-a-Service products.

We are available for projects.

You can contact me via Email, Twitter, and LinkedIn.
