Terraform with AWS (including EFS instead of EBS) Full Automation


What is Terraform?

Terraform is a tool for building, changing, and versioning infrastructure safely and efficiently. Terraform can manage existing and popular service providers as well as custom in-house solutions.

Configuration files describe to Terraform the components needed to run a single application or your entire datacenter. Terraform generates an execution plan describing what it will do to reach the desired state, and then executes it to build the described infrastructure. As the configuration changes, Terraform is able to determine what changed and create incremental execution plans which can be applied.

The infrastructure Terraform can manage includes low-level components such as compute instances, storage, and networking, as well as high-level components such as DNS entries, SaaS features, etc.



What is AWS?

Amazon Web Services (AWS) is a subsidiary of Amazon that provides on-demand cloud computing platforms and APIs to individuals, companies, and governments, on a metered pay-as-you-go basis. In aggregate, these cloud computing web services provide a set of primitive abstract technical infrastructure and distributed computing building blocks and tools. One of these services is Amazon Elastic Compute Cloud (EC2), which allows users to have at their disposal a virtual cluster of computers, available all the time, through the Internet.



Problem Statement of Task:

Create/launch an application using Terraform:

1. Create a key pair and a security group which allows port 80.

2. Launch an EC2 instance. In this EC2 instance, use the key and security group created in step 1.

3. Launch one storage volume (EFS), attach that volume to the launched EC2 instance, and mount the directory.

4. Get the code uploaded by the developer to GitHub and copy the code into the /var/www/html folder for deployment.

5. Create an S3 bucket, copy/deploy the static images into the S3 bucket, and change the permission to public readable.

6. Create a CloudFront distribution using the S3 bucket (which contains the images) and use the CloudFront URL to update the code.

7. Launch the application for testing from the code itself.

Here is the procedure to solve this task:

STEP 1: Upload the webpage content to a central repository on GitHub


STEP 2: Configuration of AWS

In this step, I configure my AWS IAM user in the CLI.


We can configure it by giving the Access Key ID & Secret Access Key of the IAM user.
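For example, with the IAM user's credentials at hand, the profile can be set up with the standard AWS CLI command (the profile name nikhil is the one referenced by the Terraform provider below):

aws configure --profile nikhil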

From now onwards, we are going to use Terraform for the remaining steps, and we save the Terraform files with the .tf extension:

STEP 3: Setup Cloud Provider

provider "aws" {
      region  = "ap-south-1"
      profile = "nikhil"
}

Here nikhil is the CLI profile name of the IAM user, and ap-south-1 is the Mumbai region.

STEP 4: Create an S3 bucket & upload the files from GitHub to that bucket

//Creating S3 bucket.

resource "aws_s3_bucket" "task2bucket1" {
  bucket = "task2bucket1"
  acl    = "public-read"

tags = {
    Name = "task2bucket1"
    
  }
}


//Uploading the image to the bucket.

resource "aws_s3_bucket_object" "file_upload1" {
  bucket = aws_s3_bucket.task2bucket1.bucket
  key    = "aws.jpg"
  source = "aws.jpg"
  content_type = "image/jpeg"
  acl = "public-read"

}

STEP 5: Create a CloudFront distribution & connect it with the same S3 bucket
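A minimal sketch of such a distribution, assuming the S3 bucket from STEP 4 as the origin (resource names, cache settings, and the output below are illustrative):

resource "aws_cloudfront_distribution" "task2cf" {
  origin {
    domain_name = aws_s3_bucket.task2bucket1.bucket_regional_domain_name
    origin_id   = "S3-task2bucket1"
  }

  enabled             = true
  default_root_object = "index.html"

  default_cache_behavior {
    allowed_methods  = ["GET", "HEAD"]
    cached_methods   = ["GET", "HEAD"]
    target_origin_id = "S3-task2bucket1"

    forwarded_values {
      query_string = false

      cookies {
        forward = "none"
      }
    }

    viewer_protocol_policy = "allow-all"
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}

//The CloudFront domain name can then be used to update the image URLs in the code.
output "cloudfront_domain_name" {
  value = aws_cloudfront_distribution.task2cf.domain_name
}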


STEP 6: Create Key Pairs
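One way to do this entirely from Terraform is to generate the key with the tls provider and register its public half as an AWS key pair; a minimal sketch (resource and key names are illustrative):

//Generate an RSA key locally.
resource "tls_private_key" "task2key" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

//Register the public key with AWS as a key pair.
resource "aws_key_pair" "task2keypair" {
  key_name   = "task2key"
  public_key = tls_private_key.task2key.public_key_openssh
}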


STEP 7: Create a Security Group

In this step, we will create our own customised security group which only allows the webserver on port 80 & SSH on port 22.
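A sketch of such a security group, assuming the default VPC (the resource name and description are illustrative):

resource "aws_security_group" "task2sg" {
  name        = "task2sg"
  description = "Allow HTTP and SSH inbound traffic"

  //Webserver access on port 80.
  ingress {
    description = "HTTP"
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  //SSH access on port 22.
  ingress {
    description = "SSH"
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  //Allow all outbound traffic.
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }

  tags = {
    Name = "task2sg"
  }
}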


STEP 8: Create EC2 instance

In this step, we launch an instance from a Red Hat Enterprise Linux 8 AMI, using the key pair & security group we created above.


We then connect to that instance over SSH with the help of the key.


And we run an executor (provisioner) which runs the commands we need, as shown in the sketch below.
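A minimal sketch of the instance, wired to the key pair and security group from the earlier steps and launched in the default VPC; the AMI ID and the package list installed by the remote executor are illustrative assumptions:

resource "aws_instance" "task2web" {
  ami           = "ami-0447a12f28fddb066" //illustrative RHEL 8 AMI ID; substitute your own
  instance_type = "t2.micro"
  key_name      = aws_key_pair.task2keypair.key_name
  security_groups = [aws_security_group.task2sg.name]

  //Connect to the instance over SSH using the generated private key.
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.task2key.private_key_pem
    host        = self.public_ip
  }

  //Run the commands we need: install the webserver, git and NFS tools.
  provisioner "remote-exec" {
    inline = [
      "sudo yum install httpd git nfs-utils -y",
      "sudo systemctl start httpd",
      "sudo systemctl enable httpd",
    ]
  }

  tags = {
    Name = "task2web"
  }
}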


STEP 9: Print & Save Public IP of instance
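A sketch of how the public IP could be printed as a Terraform output and also saved to a local file (this assumes the hashicorp/local provider; the file name is illustrative):

//Print the public IP after apply.
output "instance_public_ip" {
  value = aws_instance.task2web.public_ip
}

//Save the public IP to a local file.
resource "local_file" "public_ip_file" {
  content  = aws_instance.task2web.public_ip
  filename = "publicip.txt"
}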


STEP 10: Create an EFS Volume

In this step, we create an EFS volume & then attach it to our instance.

And then we mount the instance's /var/www/html directory on the EFS volume and pull in the code from GitHub, as sketched below.
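A sketch of the EFS file system, a mount target in the instance's subnet, and the remote mount of /var/www/html. This assumes NFS traffic (port 2049) is also allowed by the security group, and the GitHub repository URL is a placeholder:

//The EFS file system used instead of an EBS volume.
resource "aws_efs_file_system" "task2efs" {
  creation_token = "task2efs"

  tags = {
    Name = "task2efs"
  }
}

//Mount target in the subnet where the instance runs.
resource "aws_efs_mount_target" "task2efsmount" {
  file_system_id  = aws_efs_file_system.task2efs.id
  subnet_id       = aws_instance.task2web.subnet_id
  security_groups = [aws_security_group.task2sg.id]
}

//Mount the volume on /var/www/html and deploy the code from GitHub.
resource "null_resource" "mount_and_deploy" {
  depends_on = [aws_efs_mount_target.task2efsmount]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.task2key.private_key_pem
    host        = aws_instance.task2web.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo mount -t nfs4 ${aws_efs_file_system.task2efs.dns_name}:/ /var/www/html",
      "sudo rm -rf /var/www/html/*",
      "sudo git clone https://github.com/<your-username>/<your-repo>.git /tmp/webcode",
      "sudo cp -rf /tmp/webcode/* /var/www/html/",
    ]
  }
}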


STEP 11: Initialise the Provider

In this step, we initialise the provider & download all the plugins required by this .tf file with the command given below.

terraform init

STEP 12: Validate the Terraform Code

In this step, we validate our Terraform code with the command given below.

terraform validate

STEP 13: Destroy the Terraform Environment

In this step, we tear down any infrastructure previously created by Terraform with the command given below.

terraform destroy

STEP 14: Execute the Terraform Code (Final Step)

In this step, we run the Terraform code with the command given below.

terraform apply -auto-approve

Here we use -auto-approve, which automatically approves the plan so Terraform applies the changes without prompting.

After successful execution, it opens my website in the browser.
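One way to launch the application for testing from the code itself is a local-exec provisioner that opens the public IP in a browser once everything else is created; a sketch, assuming Chrome on a local Windows machine:

resource "null_resource" "open_website" {
  depends_on = [null_resource.mount_and_deploy]

  //Open the deployed site in the local browser (Windows "start" command; adjust for your OS).
  provisioner "local-exec" {
    command = "start chrome http://${aws_instance.task2web.public_ip}/"
  }
}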


Output Images


Thanks for reading this article.






















