Creating a Terraform workflow with GitLab CI using the parent-child pipelines feature

Raphael Moraes · Published in Webera · Feb 7, 2022 · 4 min read

Hello, I’m back again, this time to share a really useful way to improve GitLab CI pipelines. Sometimes it’s hard to visualize the workflow because of the number of jobs, especially when you are using GitLab to build a Terraform workflow that runs against multiple environments. GitLab does have a Terraform integration feature, and I adapted parts of it, such as the Terraform integration in merge requests. Still, some essential principles and best practices for working with Terraform are not covered in the GitLab CI documentation, and that is what I will cover in this article using the parent-child pipeline feature.

NOTE: The parent-child pipeline feature was introduced in GitLab 12.7 and is available in all tiers.

Terraform Workflow configuration

The first step is creating the repository where the templates will reside:

https://mygitlab.example.com/infrastructure-as-code/templates/cicd

The project will have the following folder structure:

├── CODEOWNERS
├── jobs
│   └── terraform-prepare.yml
├── README.md
└── workflows
    └── Terraform.gitlab-ci.yml

  • The CODEOWNERS file is used to control who has permission to modify or delete anything in the project.
  • The README.md file normally contains important instructions about the project and explains how everything there works.
  • The “jobs” folder is where the jobs/scripts reside.
  • The “workflows” folder is where the templates reside.

Below is the script (an external YAML file that will be included in the CI/CD pipeline later) that resides in the “jobs” folder:

terraform-prepare.yml
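
For reference, here is a minimal sketch of what this file could look like. The hidden job name “.terraform:prepare” and the exact commands are illustrative, not the original file; it assumes the AWS CLI and jq are available in the image and relies on the AWS_* and TF_ENV_PATH variables explained later in this article:

# Illustrative sketch of jobs/terraform-prepare.yml (not the original content).
# It assumes the image provides the AWS CLI, jq, and the Terraform CLI.
.terraform:prepare:
  before_script:
    # Assume the IAM role that Terraform will use for programmatic access
    - >
      CREDS=$(aws sts assume-role
      --role-arn "arn:aws:iam::${AWS_ACCOUNT_ID}:role/${AWS_IAM_ROLE}"
      --role-session-name "gitlab-ci-terraform"
      --query Credentials --output json)
    - export AWS_ACCESS_KEY_ID=$(echo "$CREDS" | jq -r .AccessKeyId)
    - export AWS_SECRET_ACCESS_KEY=$(echo "$CREDS" | jq -r .SecretAccessKey)
    - export AWS_SESSION_TOKEN=$(echo "$CREDS" | jq -r .SessionToken)
    - export AWS_DEFAULT_REGION="${AWS_REGION}"
    # Initialize Terraform inside the environment folder (dev, staging or production)
    - cd "${TF_ENV_PATH}"
    - terraform init -input=false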

And below, inside the “workflows” folder, is the template file that will be pulled in by the parent CI/CD pipeline’s trigger job:

Terraform.gitlab-ci.yml
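
For reference, here is a minimal sketch of what this template could look like. The image name, stage names, and exact commands are illustrative rather than the original file; the plan job follows GitLab’s documented pattern of converting the plan to JSON and exposing it through artifacts:reports:terraform:

# Illustrative sketch of workflows/Terraform.gitlab-ci.yml (not the original content).
include:
  - project: 'infrastructure-as-code/templates/cicd'
    file: 'jobs/terraform-prepare.yml'

image:
  # Illustrative custom image that ships the Terraform CLI plus the AWS CLI and jq,
  # tagged with the pinned Terraform version passed down by the parent pipeline
  name: registry.example.com/infrastructure-as-code/terraform-aws:${IMAGE_TAG}
  entrypoint: [""]

stages:
  - validate
  - plan
  - apply

validate:
  extends: .terraform:prepare
  stage: validate
  script:
    - terraform validate

plan:
  extends: .terraform:prepare
  stage: plan
  script:
    - terraform plan -input=false -out=plan.cache
    # Convert the binary plan into the JSON summary that GitLab attaches to the
    # merge request widget (counts of resources to create, update, and delete)
    - terraform show --json plan.cache | jq -r '([.resource_changes[]?.change.actions?]|flatten)|{"create":(map(select(.=="create"))|length),"update":(map(select(.=="update"))|length),"delete":(map(select(.=="delete"))|length)}' > plan.json
  artifacts:
    paths:
      - ${TF_ENV_PATH}/plan.cache
    reports:
      terraform: ${TF_ENV_PATH}/plan.json

apply:
  extends: .terraform:prepare
  stage: apply
  script:
    - terraform apply -input=false plan.cache
  when: manual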

NOTE: The command at line 67 of that template (the step that converts the plan to JSON) is responsible for generating the report that is attached to the merge request, where you can see a summary of the changes: how many resources will be created, updated, or deleted.

More details are available in the GitLab documentation on Terraform integration in merge requests.

The second step is to create the repository where the parent pipeline will reside:

https://mygitlab.example.com/infrastructure-as-code/parent-pipeline

This project will have the following folder structure:

├── CODEOWNERS
├── dev
├── staging
├── production
├── .gitlab-ci.yml
└── README.md

  • The CODEOWNERS file is used to control who has permission to modify or delete anything in the project.
  • The README.md file normally contains important instructions about the project and explains how everything there works.
  • The .gitlab-ci.yml file contains the CI/CD configuration.
  • The “dev”, “staging”, and “production” folders are where the main Terraform code for each environment resides.

Below is the content of the “.gitlab-ci.yml” file:
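
The snippet below is a minimal sketch of what this file could look like, assuming one trigger job per environment folder; the project path, branch, and example values are illustrative rather than the original content:

# Illustrative sketch of the parent .gitlab-ci.yml (not the original content).
stages:
  - deploy

.trigger:terraform:
  stage: deploy
  trigger:
    include:
      - project: 'infrastructure-as-code/templates/cicd'
        ref: main
        file: 'workflows/Terraform.gitlab-ci.yml'
    # Wait for the child pipeline to finish before marking this job as successful
    strategy: depend

dev:
  extends: .trigger:terraform
  variables:
    TF_ENV_PATH: dev
    IMAGE_TAG: "1.1.5"            # example Terraform CLI version tag
    AWS_ACCOUNT_ID: "111111111111" # example account ID
    AWS_IAM_ROLE: terraform-dev
    AWS_REGION: us-east-1

staging:
  extends: .trigger:terraform
  variables:
    TF_ENV_PATH: staging
    IMAGE_TAG: "1.1.5"
    AWS_ACCOUNT_ID: "222222222222"
    AWS_IAM_ROLE: terraform-staging
    AWS_REGION: us-east-1

production:
  extends: .trigger:terraform
  variables:
    TF_ENV_PATH: production
    IMAGE_TAG: "1.1.5"
    AWS_ACCOUNT_ID: "333333333333"
    AWS_IAM_ROLE: terraform-production
    AWS_REGION: us-east-1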

Here is what those variables mean in each job:

  • TF_ENV_PATH: Terraform environment folder where the terraform commands will be executed.
  • IMAGE_TAG: The Docker image tag for the Terraform CLI version that will be used to run the workflow. I strongly recommend pinning the version of the CLI and of every Terraform provider and module you are going to use. This way you stay in control of the workflow, knowing exactly which combination works properly when applying your code. If you decide to bump the version, you can create a new tag with a more recent version for testing; if it succeeds, you can then change the Terraform CLI version, and in case of failure you can roll back to the previous version quickly. Remember, prefer specific versions over the latest version.
  • AWS_ACCOUNT_ID: Your AWS account ID. It is used in the terraform-prepare.yml script above to assume a specific IAM role.
  • AWS_IAM_ROLE: The IAM role for programmatic access, in this case for Terraform. This role should have only the specific privileges needed to apply your code with the Terraform CLI. I recommend following the standard security advice of granting the least privilege.
  • AWS_REGION: The AWS region where you are going to apply your code for your infrastructure.

Now I will finish this article by explaining the code we use in the parent .gitlab-ci.yml file to trigger a child pipeline. This is where the magic happens:
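
Here is the relevant excerpt again, taken from the “dev” job in the sketch above (project path and branch are illustrative):

dev:
  stage: deploy
  trigger:
    include:
      - project: 'infrastructure-as-code/templates/cicd'
        ref: main
        file: 'workflows/Terraform.gitlab-ci.yml'
    strategy: depend
  variables:
    TF_ENV_PATH: dev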

Inside the “trigger:include” keyword, you call an external pipeline, as you can see in the snippet above. Inside the include section, you point to the project where the external pipeline (in our case, the child pipeline) resides and specify the branch you want to use. Last but certainly not least, the “strategy” keyword, set to “depend”, forces the trigger job to wait for the downstream pipeline to complete before it is marked as a success.

That’s all folks!! =)

For more details about the parent-child pipeline feature, check the GitLab documentation.

NOTE: I’m using an S3 remote backend to store my Terraform state files, with DynamoDB as the locking mechanism.

Thank you for reading the article, and remember:

“Knowledge acquired but not shared is lost knowledge.”
