Over the past months in my cloud and DevOps journey, I’ve constantly been looking for ways to move beyond theory and gain real, hands-on experience. As an AWS Community Builder, I’ve had the opportunity to learn from industry experts, collaborate with passionate builders, and explore how real-world cloud environments are automated and managed at scale.
While working with AWS services, I realized that manually creating resources in the console is fine for learning, but it doesn’t scale in real DevOps environments. That’s when I decided to start my Terraform journey.
I wanted to understand how automation can be used to build, deploy, and manage infrastructure efficiently. Instead of creating cloud resources manually in the AWS Console, Infrastructure as Code (IaC) allows us to define infrastructure in reusable configuration files – making deployments faster, repeatable, and less error-prone.
In this blog, I used Terraform to automate the deployment of an AWS S3 bucket and upload an object to it. I will demonstrate how to install and set up Terraform, configure AWS credentials securely via the terminal, and use Terraform to create and manage S3 buckets. The goal is to automate the provisioning of AWS S3 resources and upload files directly to S3 using Terraform configurations, ensuring efficient and repeatable infrastructure deployment. This was a great hands-on experience that strengthened my understanding of cloud automation and DevOps workflows.
Download and Install Terraform
- Head to the official Terraform download page: https://developer.hashicorp.com/terraform/downloads
🍎 macOS: Select the Apple icon in the top-left corner of your computer’s menu bar. Select About This Mac, and note whether your Chip says Apple (pick the ARM64 package) or Intel (pick the AMD64 package).
🖼️ Windows: Select the Start button and search for System Information. Note whether your System Type says x64-based or ARM-based PC.
🐧 Linux: Open a terminal and run uname -m. Match the output to the correct Linux package option.
What I Set Out to Do
I demonstrated how to:
- Install and configure Terraform
- Configure AWS credentials securely using AWS CLI
- Create and manage an S3 bucket through Terraform
- Upload a file to S3 using Terraform
- Destroy resources safely after use
The goal was to automate AWS resource provisioning instead of doing it manually, following real-world DevOps practices.
What is Terraform?
Terraform is an Infrastructure as Code (IaC) tool that lets you build and manage cloud infrastructure using configuration files instead of creating resources manually. You define what you want, such as S3 buckets, servers, networks, or databases, and Terraform automatically provisions them for you based on that blueprint.
What is Infrastructure as Code (IaC)?
Infrastructure as Code means managing infrastructure using code rather than manual setup in the cloud console. Because everything is defined in files, deployments are:
- consistent
- repeatable
- easy to rebuild or modify
- less prone to human error
Terraform is one of the most popular IaC tools because it supports multiple cloud providers like AWS, Azure, and Google Cloud, making it ideal for scalable and multi-cloud environments.
What is a Terraform binary?
A Terraform binary is the main executable file that lets you run Terraform commands on your system. It’s the core program your operating system uses to execute Terraform.
What is PATH?
PATH is a list of system folders where your computer looks for programs when you run a command in the terminal. It lets you run tools like terraform without typing their full file location.
Installing Terraform
Terraform is a powerful IaC tool that allows you to define infrastructure using configuration files.
🍎🐧 MacOS / Linux — Add Terraform to PATH
- Open Terminal and navigate to the unzipped Terraform folder, e.g.:
cd ~/Downloads/terraform_1.10.x_xxx
- (Optional) Confirm location:
pwd
- Move the Terraform binary to /usr/local/bin:
sudo mv terraform /usr/local/bin/
- Enter your system password if prompted.
You’ve just installed Terraform on your local computer. Now we’ll need to verify we’ve installed it correctly.
🖥️ Windows — Add Terraform to PATH
- Open System Properties → Advanced → Environment Variables
- Under System Variables, select PATH → Edit
- Click New and add:
C:\terraform
- Save changes and open a new terminal session.
After installation, verify the Terraform installation.
- Run the following command:
terraform version
✔️ Terraform was successfully installed.
📁 Setting Up the Terraform Project
I created my project directory and configuration file:
mkdir ~/Desktop/DD_terraform
cd ~/Desktop/DD_terraform
touch main.tf
What is main.tf?
main.tf is the primary Terraform configuration file where you define how your infrastructure should be built. The main.tf file acts as the blueprint of the infrastructure Terraform will build.
How does Terraform use configuration files?
Terraform uses configuration files (like main.tf) to describe the desired state of your infrastructure and then automatically creates or updates resources to match that state.
I will create a project directory and set up the main Terraform file (main.tf) because it organizes the infrastructure code and defines the resources to be provisioned. This structure makes the project clear, reusable, and easier for Terraform to manage and apply changes.
🧱 Defining the AWS S3 Bucket in Terraform
Inside main.tf, I defined the AWS provider and the S3 bucket:
- Open main.tf in a text editor like VS Code, NotePad++ or TextEdit.
- Copy and paste the following code into main.tf
provider "aws" {
  region = "us-east-2" # update to the region closest to you
}

resource "aws_s3_bucket" "my_bucket" {
  bucket = "unique-bucket-{yourname}-{random number}" # ensure the bucket name is unique
}

resource "aws_s3_bucket_public_access_block" "my_bucket_public_access_block" {
  bucket = aws_s3_bucket.my_bucket.id

  block_public_acls       = true
  ignore_public_acls      = true
  block_public_policy     = true
  restrict_public_buckets = true
}
Bucket names must be globally unique, so I customized the bucket name accordingly.
What’s in this Terraform configuration?
- provider “aws” — Tells Terraform to use AWS as the cloud provider and enables it to interact with AWS services.
- resource “aws_s3_bucket” “my_bucket” — Creates an S3 bucket, with my_bucket used as a reference name inside the configuration.
- resource “aws_s3_bucket_public_access_block” — Blocks public access to the bucket by enabling security settings that prevent public ACLs and policies.
How are configuration files structured in Terraform?
Terraform configuration files are structured in blocks, where each block defines a specific part of the infrastructure.
- Provider blocks specify which cloud or platform (AWS, Azure, GCP, etc.) Terraform will use.
- Resource blocks define the actual resources to create or manage, like S3 buckets, servers, or networks.
The configuration is structured in provider and resource blocks, where each block defines a specific part of the infrastructure. This keeps the code modular, readable, and easy to maintain. For example, the provider block connects to AWS, while the S3 resource block defines the bucket.
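Besides provider and resource blocks, Terraform also supports output blocks, which surface values after an apply. A minimal sketch (the output name `bucket_name` is my own choice, not part of the original configuration):

```hcl
# Output block: prints the bucket's name after `terraform apply`.
# The value references the aws_s3_bucket resource defined in main.tf.
output "bucket_name" {
  value = aws_s3_bucket.my_bucket.bucket
}
```

After `terraform apply`, the value appears in the terminal output, which is handy for passing names into scripts or other tooling.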
- Visit the Terraform Registry’s documentation on S3 buckets.
What is the Terraform registry?
The Terraform Registry is an online library of Terraform providers and modules. It contains documentation, examples, best practices, and configuration options. Developers can also publish modules there, allowing others to reuse them instead of writing everything from scratch.
What can I find in this aws_s3_bucket page?
The aws_s3_bucket page in the Terraform documentation explains how to create and manage S3 buckets. It includes configuration options, properties, examples, and details on how the bucket integrates with other AWS services.
I visited the Terraform documentation to understand how to use the aws_s3_bucket resource. It explains how to configure S3 buckets using arguments like bucket name, ACLs, and tags, and provides examples and best-practice guidance, helping me customize and extend my project confidently.
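For example, one optional configuration from the documentation is bucket versioning, which (in recent AWS provider versions) is managed through its own resource. A hedged sketch, assuming the `my_bucket` resource name from earlier:

```hcl
# Enables versioning so overwritten or deleted objects
# keep their previous versions in the bucket.
resource "aws_s3_bucket_versioning" "my_bucket_versioning" {
  bucket = aws_s3_bucket.my_bucket.id

  versioning_configuration {
    status = "Enabled"
  }
}
```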
- From the documentation, choose an additional configuration that you might want to add to your S3 bucket or the object you’ve uploaded.
- Write your configuration in main.tf
- Save your changes to main.tf
I customized my S3 bucket by adding tags to help identify and organize resources. I added the tag Environment = "Dev" to label its purpose and environment. I verified this customization in the AWS S3 console by checking the Tags section, where these labels appear, making the bucket easier to manage and filter in larger environments.
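The tag customization slots into the existing aws_s3_bucket resource, which would then look roughly like this (the Name tag is an illustrative addition of mine, not part of the original setup):

```hcl
resource "aws_s3_bucket" "my_bucket" {
  bucket = "unique-bucket-{yourname}-{random number}"

  # Tags appear under the bucket's Tags section in the S3 console.
  tags = {
    Environment = "Dev"
    Name        = "terraform-s3-demo" # hypothetical label
  }
}
```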
⚙️ Initializing and Planning the Terraform Configuration
I will initialize Terraform and run a plan because initialization sets up the project and downloads the required provider plugins, while the plan previews the changes Terraform will make in AWS. This helps verify the configuration before applying it and reduces the risk of errors or unexpected changes.
I initialized Terraform using:
terraform init
This downloaded the AWS provider plugin and prepared the project.
Next, I previewed the changes Terraform would make:
terraform plan
This execution plan showed which resources would be created before anything was applied.
I ran terraform plan to generate an execution plan that previews the changes Terraform will make to my AWS infrastructure based on the configuration in main.tf. It shows which resources will be created, updated, or deleted, allowing me to review the configuration safely before applying any real changes.
What if you get an error?
When I first tried to plan my Terraform configuration, I received an error message saying the AWS provider couldn’t authenticate because my credentials were missing or incorrect. This happened because I hadn’t configured the AWS CLI properly or had used invalid access keys. Terraform needs valid credentials to connect to AWS and manage resources, so the plan command fails without proper authentication setup.
🔑 Configuring AWS CLI and Credentials
To allow Terraform to authenticate with AWS, I installed AWS CLI and verified it:
aws --version
I will install the AWS CLI, generate access keys, and configure the CLI so Terraform can securely authenticate with my AWS account. This ensures Terraform has the required permissions to create and manage resources like S3 buckets during deployment.
Then configured my credentials:
aws configure
I entered:
- Access Key ID
- Secret Access Key
- AWS Region
- Output format (JSON)
This enabled secure programmatic access for Terraform.
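As an alternative to relying on the default credentials chain, the provider block can point at a named AWS CLI profile explicitly. A sketch, assuming the profile "default" written by `aws configure` exists in ~/.aws/credentials:

```hcl
provider "aws" {
  region  = "us-east-2"
  profile = "default" # reads the credentials saved by `aws configure`
}
```

This is useful when you work with multiple AWS accounts and want to be explicit about which credentials Terraform uses.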
🪣 Creating the S3 Bucket with Terraform
Once everything was configured, I deployed the infrastructure:
terraform apply
I will apply my Terraform configuration because this command takes the planned changes and actually creates the defined infrastructure, in this case an S3 bucket, on AWS. This is the step where Terraform provisions resources, after which I will verify the bucket in the S3 console to ensure it was created as defined in main.tf.
After reviewing the plan, I confirmed with 'yes', and Terraform created the S3 bucket successfully.
I ran terraform apply to execute the changes in my configuration and provision resources like the S3 bucket. This command creates, updates, or deletes resources in my AWS account based on main.tf, and it confirms the plan before applying changes, turning the code into real infrastructure.
Seeing the bucket appear in AWS was a very satisfying moment 🙂
💎 Uploading an Object to S3 Using Terraform
Next, I uploaded a file to the bucket using Terraform.
I will update my Terraform configuration, reapply it, and document the changes to demonstrate how Terraform detects and applies infrastructure updates. By modifying settings like tags or versioning, I can show how changes are managed safely and reliably using Infrastructure as Code.
Add a New Image File
- Download this image. To download it as an image, you can right click on this link and select Save Link As…
I placed image.png inside my project folder and added:
resource "aws_s3_object" "image" {
  bucket = aws_s3_bucket.my_bucket.id
  key    = "image.png"
  source = "image.png"
}
Then I reapplied the configuration:
terraform plan
terraform apply
We need to run terraform apply again because any changes made to the configuration must be applied to update the actual infrastructure. Terraform compares the current state with the new configuration and applies only the detected changes, ensuring the deployed environment stays consistent with the code. Terraform detected the new resource and uploaded the file automatically.
To validate my update, I confirmed that terraform apply completed successfully, then checked the S3 console and verified that the image.png file appeared in my bucket. This confirmed that the Terraform changes were applied correctly and reflected in my AWS environment.
I validated the upload by downloading and viewing the file inside AWS S3.
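One caveat with aws_s3_object: if you later change the file’s contents without renaming it, Terraform may not detect the change on its own. Adding an etag based on the file’s MD5 hash makes Terraform re-upload when the content changes. A sketch, extending the same image.png resource from above (note the etag approach doesn’t work with KMS-encrypted objects):

```hcl
resource "aws_s3_object" "image" {
  bucket = aws_s3_bucket.my_bucket.id
  key    = "image.png"
  source = "image.png"

  # filemd5() hashes the local file; when the hash changes,
  # terraform plan shows the object as needing an update.
  etag = filemd5("image.png")
}
```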
🧹 Cleaning Up — Destroying Resources
To avoid keeping unused resources, I removed the infrastructure using:
terraform destroy
Terraform deleted:
- The S3 bucket
- The uploaded object
I also deleted the IAM access key created for the project following security best practices.
📚 Key Concepts I Learned
This project helped me strengthen my understanding of:
✔️ Infrastructure as Code (IaC)
✔️ Terraform providers and resource blocks
✔️ Terraform lifecycle
init → plan → apply → destroy
✔️ State files and execution plans
✔️ Secure AWS authentication via CLI
✔️ S3 storage automation
I also experienced how Terraform enables:
- repeatable deployments
- reduced manual effort
- version-controlled infrastructure
- safer cloud operations
GitHub link: https://github.com/dineshrajdhanapathyDD/Terraform-series.git
Conclusion
Terraform is a highly practical and in-demand skill for roles such as DevOps Engineer, DevSecOps Engineer, Platform Engineer, and Cloud Engineer. Through this project, I gained hands-on experience in real-world cloud automation from configuring AWS credentials to provisioning and managing S3 buckets using Infrastructure as Code.
I worked with AWS S3 for storage and the AWS CLI for authentication, and strengthened key Terraform concepts including providers, resource blocks, and state files. I also practiced the full Terraform workflow of initialization, planning, applying changes, and updating infrastructure safely and consistently.
This experience has motivated me to continue expanding my Terraform journey into areas like EC2 provisioning, VPC and networking automation, bucket versioning and lifecycle rules, and remote Terraform state management.
Overall, this project was a great learning experience and a meaningful step forward in my cloud and DevOps journey.
Thank you for taking the time to read my article! If you found it helpful, feel free to like, share, and drop your thoughts in the comments; I’d love to hear from you.
If you want to connect or dive deeper into cloud and DevOps, feel free to follow me on my socials:
👨💻 DEV Community
🐙 GitHub