Building a Local AWS Environment with LocalStack and Terraform
When working on development related to AWS, have you ever thought, "I wish I could do everything locally, including development and testing"? Even though AWS offers a free tier, setting up a development account can be a bit of a hassle. Plus, there’s always the risk of accidentally exceeding the free tier limits.
Back when I was just starting out (around 5-6 years ago), I set up an AWS account to learn and experiment. One day, I was shocked to discover that I had racked up a hefty bill. I had forgotten to close the account I used for learning, and various services kept running... A very expensive lesson learned.
Even if your company provides a shared development account, your actions might impact others, potentially leading to unintended charges.
In this article, I’ll introduce the steps to set up your local environment for testing AWS services using the following tools:
- Docker Desktop
- LocalStack (a service that provides mock AWS environments)
- terraform-local (a handy tool for running Terraform locally)
While it's possible to use aws-cli for setting things up, this article will focus on using Terraform to build AWS services. This setup could also be useful as a learning environment for Terraform.
Target Audience
This article is intended for Mac users.
Tool Installation
Docker Desktop
Install Docker Desktop by downloading the appropriate version for your OS from the official site:
- [Mac] https://docs.docker.com/desktop/install/mac-install/
- [Windows] https://docs.docker.com/desktop/install/windows-install/
- [Linux] https://docs.docker.com/desktop/install/linux-install/
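After installation, it's worth confirming that the docker CLI and the Compose plugin are available from your terminal (the version numbers you see will of course differ):

docker --version
docker compose version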
terraform-local
Installing Terraform is required to follow the steps in this article. If you haven’t installed it yet, please refer to this article for guidance.
Here is the official repository for terraform-local; if you encounter issues with commands not working, refer to it. The official instructions suggest installing the package via pip, but I found that installing with brew also works, so I recommend the following:

brew install terraform-local

Using terraform-local allows you to skip the AWS credential checks when running Terraform.
Once the installation is complete, you can execute tflocal from the terminal. Confirm it with the following command:
tflocal -v
This should display the current version of Terraform installed:
Terraform v1.9.5
The commands used with tflocal are basically the same as those used with Terraform:
| Terraform | terraform-local |
|---|---|
| terraform init | tflocal init |
| terraform plan | tflocal plan |
| terraform apply | tflocal apply |
| terraform destroy | tflocal destroy |
With that, the necessary tools are installed.
Setting Up a LocalStack Environment
LocalStack allows you to set up a mock AWS environment locally. The services available depend on the license you have.
While a Pro license grants access to most services, my motivation here is to avoid unnecessary charges, so I've opted not to purchase one.
Let’s set up LocalStack. I recommend setting it up in a Docker environment for easy removal or initialization when no longer needed.
Here’s the directory structure we’ll be working with, using a simple Lambda function as the sample case:
application/
├── compose.yml ... LocalStack container definition
├── docker/
│ └── localstack/
│ ├── terraform/ ... Terraform files for setting up the LocalStack environment
│ │ └── hello-world/ ... A directory for each Lambda function
│ │ ├── zip/ ... Directory for the Lambda zip file
│ │ └── main.tf
│ └── volume/ ... LocalStack container mount point
└── src/
└── lambda/
└── hello-world
└── index.js ... Function to be deployed to Lambda
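If you'd like to create this skeleton in one go, the following commands (run from the application/ directory) are one way to do it; adjust them if your layout differs:

mkdir -p docker/localstack/terraform/hello-world/zip
mkdir -p docker/localstack/volume
mkdir -p src/lambda/hello-world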
Let’s create a Docker Compose file. Although the official documentation provides an example, I’ve modified it slightly to fit this directory structure. (The official documentation recommends naming Docker Compose files compose.yaml.)
version: "3.8"

services:
  localstack:
    container_name: "${LOCALSTACK_DOCKER_NAME:-localstack-main}"
    image: localstack/localstack
    ports:
      - "127.0.0.1:4566:4566"            # LocalStack Gateway
      - "127.0.0.1:4510-4559:4510-4559"  # external services port range
    environment:
      # LocalStack configuration: https://docs.localstack.cloud/references/configuration/
      - DEBUG=${DEBUG:-0}
    volumes:
      - "./docker/localstack/volume:/var/lib/localstack"
      - "/var/run/docker.sock:/var/run/docker.sock"
Move to the project’s root directory and execute the following command. This completes the creation of the LocalStack Docker container.
docker compose up -d
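Before moving on, you can check that the container came up properly. LocalStack exposes a health endpoint on the gateway port; the exact set of services it reports depends on your LocalStack version:

curl -s http://localhost:4566/_localstack/health

You should see a JSON response listing services such as lambda and their status.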
Lambda Function
Let’s prepare a Lambda function as a sample case. Here, we’ll refer to the simple "Hello World" Lambda function provided by LocalStack’s official documentation, which returns the string "Hello World".
exports.handler = async (event) => {
  return {
    statusCode: 200,
    body: "Hello World!",
  };
};
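Before packaging, you can optionally smoke-test the handler with Node.js (assuming Node is installed locally). Run this from src/lambda/hello-world; it simply calls the exported handler and prints its return value:

node -e "require('./index').handler({}).then(console.log)"

If everything is wired up, it prints { statusCode: 200, body: 'Hello World!' }.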
We’ll upload the code to Lambda as a zip file. Zip it with the following commands, name the archive function.zip, and place it in docker/localstack/terraform/hello-world/zip:
zip function.zip index.js
mv function.zip path/to/zip/folder
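If you want to double-check the archive before deploying, listing its contents should show index.js at the top level (run this wherever the zip file currently sits):

unzip -l function.zip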
Terraform File
We’ll use Terraform to create a Lambda within the LocalStack environment. Here’s the Terraform configuration:
provider "aws" {
  region     = "us-east-1"
  access_key = "local_access_key"
  secret_key = "local_secret_key"

  skip_credentials_validation = true
  skip_region_validation      = true
  skip_requesting_account_id  = true
  skip_metadata_api_check     = true

  endpoints {
    lambda = "http://localhost:4566"
  }
}

resource "aws_lambda_function" "hello-world-lambda" {
  function_name = "HelloWorldFunction"
  filename      = "./zip/function.zip"
  role          = "arn:aws:iam::000000000000:role/lambda-role"
  runtime       = "nodejs20.x"
  handler       = "index.handler"
  timeout       = 60
  memory_size   = 128
}
When executing this file with regular Terraform commands, credentials and region validity are usually checked. By using terraform-local, you can skip these validations.
Deploying the HelloWorldFunction Lambda
Congratulations! You’ve completed the preparations to create a Lambda in LocalStack. Now, let’s deploy the environment. First, navigate to the directory where the Terraform files are located:
cd application/docker/localstack/terraform/hello-world
Next, run the terraform-local command:
tflocal init
tflocal apply
The details of the Lambda you’re creating will be output to the console. This output should match the contents of your Terraform file. Review the details, then proceed with the creation.
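After the apply completes, one quick sanity check is to list what Terraform recorded in its state; with the configuration above you should see the single Lambda resource:

tflocal state list

Expected output (based on the resource name used in this article):

aws_lambda_function.hello-world-lambda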
Executing the HelloWorldFunction Lambda
Let’s execute the Lambda we just created. You’ll need to install aws-cli to do this. If you haven’t already, follow the official installation instructions.
Once installed, set up credentials to allow aws-cli to execute commands within the LocalStack environment:
vim ~/.aws/config
# Add a profile
[profile localstack]
region=us-east-1
output=json
endpoint_url = http://localhost:4566
vim ~/.aws/credentials
# Add credentials
[localstack]
aws_access_key_id=local_access_key
aws_secret_access_key=local_secret_key
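To confirm the profile works against LocalStack, you can ask STS who you are. With the dummy credentials above, LocalStack should report the default account ID 000000000000 (STS is available in the community edition):

aws sts get-caller-identity \
  --endpoint-url http://localhost:4566 \
  --profile localstack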
Once this setup is complete, run the following command to retrieve information about the Lambda:
aws lambda list-functions \
  --endpoint-url http://localhost:4566 \
  --profile localstack
Expected output:
{
    "Functions": [
        {
            "FunctionName": "HelloWorldFunction",
            "FunctionArn": "arn:aws:lambda:us-east-1:000000000000:function:HelloWorldFunction",
            "Runtime": "provided.al2",
            "Role": "arn:aws:iam::000000000000:role/lambda-role",
            "Handler": "handler",
            "CodeSize": 262,
            "Description": "",
            "Timeout": 60,
            "MemorySize": 128,
            "LastModified": "2024-08-31T00:00:00.000000+0000",
            "CodeSha256": "CLFtqm1Xy2/1tOH+gsEW2Ik4yerXxIMxUHt7Ct7qbP4=",
            "Version": "$LATEST",
            "TracingConfig": {
                "Mode": "PassThrough"
            },
            "RevisionId": "520868a9-7949-436f-a032-fc83bd6afe74",
            "PackageType": "Zip"
        }
    ]
}
- --endpoint-url: Specifies the command’s target endpoint.
- --profile: Specifies the profile to use when running the command.
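As a side note, if you don’t want to repeat these flags on every command, you can export AWS_PROFILE=localstack; recent versions of the AWS CLI also honor the endpoint_url setting in the profile, so the endpoint flag becomes optional as well:

export AWS_PROFILE=localstack
aws lambda list-functions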
If the Lambda details are output as shown, the setup was successful: you’ve created a Lambda in your local environment. Next, invoke the function:
aws lambda invoke \
  --function-name HelloWorldFunction \
  --endpoint-url http://localhost:4566 \
  --profile localstack \
  output.txt
Expected output:
{
"StatusCode": 200,
"ExecutedVersion": "$LATEST"
}
The result of the execution is saved to output.txt.
cat output.txt
{"statusCode":200,"body":"Hello World!"}
If the output matches what was implemented in the Lambda, it’s a success.
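When you’re done experimenting, teardown is just as easy, which was part of the appeal of doing all this in Docker. Remove the Terraform-managed resources from the hello-world directory, then stop the container from the project root:

tflocal destroy
docker compose down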
Conclusion
That’s it! We’ve successfully set up an AWS environment locally using LocalStack and Terraform. As mentioned at the beginning, this setup can be useful as a testing environment during development or as a learning environment for Terraform and aws-cli.
The source code used in this article can be found on GitHub.