Tidying up after failed Terraform tests

Andreas Wittig – 22 May 2024

Automated tests are making their way into Infrastructure as Code projects. Recently, I’ve implemented tests with Terraform’s test framework, which was released in October 2023. However, under rare circumstances, I ran into the issue that Terraform could not remove all AWS resources at the end of a test.

The problem and solution discussed in the following also apply to OpenTofu.
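
To give some context, here is a minimal sketch of what such a test file could look like. The file name matches the test output below, but the variable, output, and assertion are made up for illustration:

# tests/default.tftest.hcl - minimal sketch; the variable "environment" and
# the output "private_subnet_ids" are hypothetical names.
run "execute" {
  command = apply

  variables {
    environment = "test"
  }

  assert {
    condition     = length(output.private_subnet_ids) == 2
    error_message = "Expected two private subnets."
  }
}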

Here is the typical message that terraform test prints when it fails to clean up all resources.

Terraform left the following resources in state after executing
tests/default.tftest.hcl/execute, and they need to be cleaned up manually:
- aws_subnet.private[0]
- aws_subnet.private[1]
- aws_subnet.public[0]
- aws_subnet.public[1]
- aws_vpc.this
- ...

Leftover AWS resources are an issue, especially when running tests in an automated manner, because they cause unwanted costs. Therefore, I was looking for a solution to tidy up AWS resources regularly. The tool aws-nuke, by rebuy, deletes all resources belonging to an AWS account.

The following snippet shows the configuration file nuke-config.yml for aws-nuke. First, define the regions in which aws-nuke shall remove resources. global is needed to delete global resources like IAM roles and policies. For safety reasons, you must define an account-blocklist with the AWS account IDs that you never want to tidy up. Next, you define the accounts from which you want to remove all resources. The filters are required to keep some essential resources, such as the IAM role and policy used by aws-nuke to access the AWS account.

regions:
- eu-west-1
- global
account-blocklist:
- '999999999999'
accounts:
  '111111111111':
    filters:
      IAMRole:
      - 'nuke'
      IAMRolePolicy:
      - type: glob
        value: "nuke -> *"

My recommendation is to run aws-nuke with the dry run option activated - which is the default - and check for resources you want to keep. Then, add a filter for those resources. Learn how to install aws-nuke, then run it with your configuration file:

aws-nuke -c nuke-config.yml
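
If you prefer not to install aws-nuke locally, the dry run can also be started via the Docker image that is used in the workflow further below. The following is a sketch; mounting your AWS credentials into the container and the profile name default are assumptions:

# Sketch: run the dry run via Docker instead of a local installation.
# Mounting ~/.aws and using the profile 'default' are assumptions.
docker run --rm -it \
  -v $(pwd)/nuke-config.yml:/home/aws-nuke/config.yml \
  -v $HOME/.aws:/home/aws-nuke/.aws \
  quay.io/rebuy/aws-nuke:v2.25.0 \
  --profile default \
  --config /home/aws-nuke/config.yml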

For example, the dry run shows that aws-nuke would delete the VPC and subnets left over from the Terraform test.

> aws-nuke-example

eu-west-1 - EC2Subnet - 'subnet-154d844e' - would remove
eu-west-1 - EC2Subnet - 'subnet-af12a261' - would remove
eu-west-1 - EC2Subnet - 'subnet-cd2fa222' - would remove
eu-west-1 - EC2Subnet - 'subnet-51223aff' - would remove
eu-west-1 - EC2VPC - 'vpc-c6159fa1' - would remove
Scan complete: 13 total, 5 nukeable, 10 filtered.
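
If the dry run lists resources you want to keep, extend the filters in nuke-config.yml before disabling the dry run. The following sketch keeps a hypothetical S3 bucket by name and all EC2 instances carrying a hypothetical tag; check the resource type names and filter syntax against the aws-nuke documentation:

accounts:
  '111111111111':
    filters:
      IAMRole:
      - 'nuke'
      IAMRolePolicy:
      - type: glob
        value: "nuke -> *"
      # Keep a specific S3 bucket (hypothetical name).
      S3Bucket:
      - 's3://my-terraform-state'
      # Keep all EC2 instances with a specific tag (hypothetical tag).
      EC2Instance:
      - property: 'tag:keep'
        value: 'true'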

While it’s possible to run aws-nuke from your machine to clean up leftover AWS resources, a scheduled job is the way to go. As I’m running the command terraform test within a CI/CD pipeline on GitHub, I decided to use a scheduled GitHub workflow to run aws-nuke once a day. The following snippet shows how to define such a workflow.

name: 'nuke'
on:
  workflow_dispatch:
  schedule:
  - cron: '0 0 * * *'
concurrency:
  group: 'nuke'
  cancel-in-progress: false
permissions:
  id-token: write
  contents: read
jobs:
  nuke:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4
    - name: 'Assuming IAM role'
      uses: aws-actions/configure-aws-credentials@v4
      with:
        role-to-assume: arn:aws:iam::111111111111:role/nuke
        role-session-name: nuke
        aws-region: eu-west-1
    - name: 'Tidying up AWS resources'
      run: |
        docker run -e AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID} -e AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY} -e AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN} --rm -v ./nuke-config.yml:/home/aws-nuke/config.yml quay.io/rebuy/aws-nuke:v2.25.0 --config /home/aws-nuke/config.yml --force --no-dry-run
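
The workflow authenticates via GitHub’s OIDC provider (hence the id-token: write permission) and assumes the IAM role nuke that is excluded by the aws-nuke filters above. A minimal Terraform sketch of such a role could look like the following; the repository name example-org/example-repo and granting full administrative permissions are assumptions:

# Sketch: IAM role for the GitHub workflow, assuming the GitHub OIDC provider
# is already registered in the AWS account and the repository is
# example-org/example-repo (hypothetical).
data "aws_iam_openid_connect_provider" "github" {
  url = "https://token.actions.githubusercontent.com"
}

data "aws_iam_policy_document" "assume" {
  statement {
    actions = ["sts:AssumeRoleWithWebIdentity"]

    principals {
      type        = "Federated"
      identifiers = [data.aws_iam_openid_connect_provider.github.arn]
    }

    condition {
      test     = "StringEquals"
      variable = "token.actions.githubusercontent.com:aud"
      values   = ["sts.amazonaws.com"]
    }

    condition {
      test     = "StringLike"
      variable = "token.actions.githubusercontent.com:sub"
      values   = ["repo:example-org/example-repo:*"]
    }
  }
}

resource "aws_iam_role" "nuke" {
  name               = "nuke"
  assume_role_policy = data.aws_iam_policy_document.assume.json
}

# aws-nuke needs broad permissions to list and delete resources. The inline
# policy matches the IAMRolePolicy filter "nuke -> *" in nuke-config.yml.
data "aws_iam_policy_document" "admin" {
  statement {
    actions   = ["*"]
    resources = ["*"]
  }
}

resource "aws_iam_role_policy" "nuke" {
  name   = "nuke"
  role   = aws_iam_role.nuke.id
  policy = data.aws_iam_policy_document.admin.json
}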

By the way, have you heard about our solution HyperEnv for GitHub Actions Runner to spin up EC2 instances on-demand for executing GitHub workflow jobs?

Summary

Watch out for leftover AWS resources after executing Terraform tests. Running aws-nuke periodically ensures leftover AWS resources are deleted and unwanted costs are avoided.

Andreas Wittig

I’ve been building on AWS since 2012 together with my brother Michael. We are sharing our insights into all things AWS on cloudonaut and have written the book AWS in Action. Besides that, we’re currently working on bucketAV, HyperEnv for GitHub Actions, and marbot.
