Creating Azure DevOps Pipelines using Terraform

Harinderjit Singh · Published in ITNEXT · 9 min read · Jul 30, 2023


Objective

After my last post, An Azure DevOps Pipeline to manage Azure resources using Terraform, I want to go a step further and build the Azure DevOps project, repository, pipelines, etc. using Terraform. That means we can manage Azure DevOps resources using Infrastructure as Code and create pipelines to manage pipelines (and almost all other Azure DevOps components).

We will use the “azuredevops” provider to write basic modules that create an Azure DevOps project, environment, code repositories, pipelines, variable groups, etc.
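As a minimal sketch, configuring the azuredevops provider looks like this (the organization URL and the variable holding the token are placeholders; credentials can also come from environment variables, as we will see with set-env.sh later):

```hcl
terraform {
  required_providers {
    azuredevops = {
      source  = "microsoft/azuredevops"
      version = ">= 0.6.0"
    }
  }
}

# Credentials may instead be supplied via the AZDO_ORG_SERVICE_URL and
# AZDO_PERSONAL_ACCESS_TOKEN environment variables, which keeps the
# token out of the Terraform code.
provider "azuredevops" {
  org_service_url       = "https://dev.azure.com/<your-organization>"
  personal_access_token = var.azdo_pat # hypothetical variable name
}
```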

We will use two modules “azdo_project” and “azdo_pipeline” from this GitHub repository to implement pure “Pipeline as Code”.

Although it is optional, before you proceed further I highly recommend going through my previous post to understand the difference between the manual and Terraform approaches to creating Azure DevOps resources.

Prerequisites

Create a personal access token

  1. Go to your Azure DevOps organization.
  2. Click the icon next to your profile icon at the top right corner.
  3. Select “Personal access tokens”.
  4. Click “New Token” then create a new personal access token with the access required by your template. This will be driven primarily based on which resources you need to provision in Azure DevOps. A token with Full access scope will work but may provide more access than you need.
  5. Copy the personal access token.
  • It will be used by the Terraform client to authenticate against your Azure DevOps organization.
  • It will also be used during agent configuration on your machine for authentication.

Add SSH Key

  • We need authentication set up for the laptop from which you will commit code. For that, we will create an SSH key (or use an existing one) and add the public SSH key to the Security settings in Azure DevOps.
  • Navigate to user settings > Security > SSH Public Keys
  • Click on New Key and paste the public key from your laptop
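For example, on Linux or WSL you can generate a key pair and print the public half to paste into Azure DevOps (the file name and the comment below are just examples):

```shell
# Generate a 4096-bit RSA key pair (no passphrase here for brevity;
# consider adding one in practice)
mkdir -p ~/.ssh
ssh-keygen -t rsa -b 4096 -C "you@example.com" -f ~/.ssh/id_rsa_azdo -N ""

# Print the public key; paste this into User settings > SSH public keys
cat ~/.ssh/id_rsa_azdo.pub
```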

Create Storage account for Terraform state

  • Manually create a storage account in your Azure subscription to serve as the backend for the Terraform state of the Azure DevOps resources.
  • The storage account serving as the backend for the Terraform state of the Azure resources (infrastructure) will be created later by “tf-infra-pipeline”.

Create a Blob container

  • Manually create a Blob container in the storage account for the Terraform state of the Azure DevOps resources.
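The two manual steps above can also be done with the Azure CLI from an authenticated session; the resource group, account name, and location below are placeholders:

```shell
# Resource group, storage account and blob container that will hold the
# Terraform state for the Azure DevOps resources
az group create --name rg-tfstate --location eastus

az storage account create --name satfstateazdo123 \
  --resource-group rg-tfstate --location eastus --sku Standard_LRS

az storage container create --name azdo-tfstate \
  --account-name satfstateazdo123
```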

Update config_azdo.yaml

#on your machine
cd
git clone https://github.com/harinderjits-git/azdo-pipelines.git
  • We need to make changes to configuration in azdo-pipelines/terraform/azure/terragrunt/azdo/config_azdo.yaml
  • All attributes that need to be updated for your environment have the comment “#replace this value”.
  • The azdo_remote_state attributes must match the storage account you created in the last step.
  • Update the variables’ attribute “equalto” as per your environment.
  • Save the changes.

Update set-env.sh

  • Update the below environment variables at azdo-pipelines/terraform/azure/terragrunt/set-env.sh
  • These environment variables are required by Terraform to authenticate to your Azure DevOps organization.
  • Save the changes
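The azuredevops provider reads its credentials from environment variables, so set-env.sh will typically export something like the following (values are placeholders):

```shell
# Authentication for the azuredevops Terraform provider
export AZDO_ORG_SERVICE_URL="https://dev.azure.com/<your-organization>"
export AZDO_PERSONAL_ACCESS_TOKEN="<your-personal-access-token>"
```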

Creation of Azure DevOps Resources

The tools below must be installed on your machine before you proceed.

Prerequisites

The first time, the Terraform code is executed from your machine:

az login

cd azdo-pipelines/terraform/azure/terragrunt
. ./set-env.sh
make apply-azdo-all

Resources created by the above operation

Project

  • An Azure DevOps project called “gitops_project” is created.
  • Created using “azdo_project” Terraform module.

Repositories

  • Two repositories “tf-azdo-repo” and “tf-infra-repo” are created.
  • “tf-azdo-repo” imports contents of a GitHub repo azdo-pipelines.
  • “tf-infra-repo” imports contents of a GitHub repo gitops-azdevops.
  • Created using “azdo_project” Terraform module.
  • gitops_project repository is created by default when the project is created.
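Under the hood, importing a GitHub repository into an Azure DevOps repo can be expressed with the provider's azuredevops_git_repository resource; the sketch below mirrors this setup, but the exact internals of the module may differ:

```hcl
# Create "tf-azdo-repo" and seed it with the contents of a public GitHub repo.
# "azuredevops_project.gitops_project" is an assumed resource name here.
resource "azuredevops_git_repository" "tf_azdo_repo" {
  project_id = azuredevops_project.gitops_project.id
  name       = "tf-azdo-repo"

  initialization {
    init_type   = "Import"
    source_type = "Git"
    source_url  = "https://github.com/harinderjits-git/azdo-pipelines.git"
  }
}
```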

Pipelines

  • Two pipelines i.e. “tf-azdo-pipeline” and “tf-infra-pipeline” are created.
  • “tf-azdo-pipeline” pipeline depends on the “tf-azdo-repo” repository
  • “tf-infra-pipeline” pipeline depends on the “tf-infra-repo” repository
  • Created using “azdo_pipeline” Terraform module.
  • Pipeline variables are also created using “azdo_pipeline” Terraform module.
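A pipeline bound to a repo ultimately maps to the provider's azuredevops_build_definition resource; the following is a hedged sketch (resource names and the YAML path are assumptions, not the module's actual internals):

```hcl
# YAML-based pipeline wired to the "tf-azdo-repo" Azure DevOps Git repo
resource "azuredevops_build_definition" "tf_azdo_pipeline" {
  project_id = azuredevops_project.gitops_project.id
  name       = "tf-azdo-pipeline"

  ci_trigger {
    use_yaml = true # trigger rules come from the pipeline YAML itself
  }

  repository {
    repo_type   = "TfsGit" # an Azure DevOps-hosted Git repository
    repo_id     = azuredevops_git_repository.tf_azdo_repo.id
    branch_name = azuredevops_git_repository.tf_azdo_repo.default_branch
    yml_path    = "azure-pipelines.yml" # assumed path in the repo
  }

  # Pipeline variables can be declared on the definition as well
  variable {
    name  = "terragrunt_action"
    value = "apply"
  }
}
```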

Environment

  • An environment called “test-approval-check” is created that is used by pipelines.
  • Created using “azdo_project” Terraform module.
  • This is where we can define the approvals to be used in pipelines.

Variable Group

  • A Variable group called vg1 is created which has variables defined in it.
  • The variables in a Variable group can be accessed by Pipelines.
  • We are not restricting the vg1 to a specific Pipeline in this setup.
  • Created using “azdo_project” Terraform module.
  • This Variable group is visible to all pipelines in the project unless restricted explicitly.
  • The variables created in the Variable group using Terraform are all clear text.
  • I tried to make them secrets, but when I chose the secret type using the azuredevops provider, the values were left blank. It appears to be an ongoing bug.
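For reference, a variable group is declared roughly like this (variable names here are illustrative); the commented-out secret variant is where I hit the bug described above:

```hcl
resource "azuredevops_variable_group" "vg1" {
  project_id   = azuredevops_project.gitops_project.id
  name         = "vg1"
  allow_access = true # not restricted to a specific pipeline in this setup

  variable {
    name  = "ENVIRONMENT"
    value = "test" # stored as clear text
  }

  # Declaring a secret like this is where the provider bug showed up for me:
  # the variable was created but its value was left blank.
  # variable {
  #   name         = "DB_PASSWORD"
  #   secret_value = var.db_password
  #   is_secret    = true
  # }
}
```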

Agent Pool

  • An Agent pool called “Newlocal” is also created.
  • Created using “azdo_project” Terraform module.
  • There is no agent associated with this at this moment.

So these are the resources which were created by our Terraform code.

Update the Pipeline Environment

  • Navigate to Pipelines > Environments > “test-approval-check” > Approvals and checks
  • Edit Check and select “Approvals”
  • Update Approvers
  • Configure control options with timeout as 1 hour and click Create.

Update config_azdo.yaml in “tf-azdo-repo”

Update config_azdo.yaml in the Azure DevOps repository “tf-azdo-repo” with all the changes we made earlier to the cloned repository on your machine.

You can either clone this repo on your machine and commit and push changes to the remote origin (preferred method), or you may update the remote repository directly.

Create an agent

  • Agent pool Newlocal is already created.

I am using a local agent since this is a learning environment and I am not patient enough to wait a few days for a hosted agent to become available.

  • Navigate to project settings.
  • Under Pipelines click on Agent Pools and click on “Newlocal”.
  • Click on “New agent”.
  • Choose the appropriate OS and follow the instructions on your machine.
hsingh@DESKTOP-424H6PI:~/adevopsagent/agent1$ ./config.sh
Enter (Y/N) Accept the Team Explorer Everywhere license agreement now? (press enter for N) > Y
>> Connect:
Enter server URL > https://dev.azure.com/harinderjitss/
Enter authentication type (press enter for PAT) >
Enter personal access token > ****************************************************
Connecting to server ...
>> Register Agent:
Enter agent pool (press enter for default) > Newlocal
Enter agent name (press enter for ) >
Scanning for tool capabilities.
Connecting to the server.
Successfully added the agent
Testing agent connection.
Enter work folder (press enter for _work) >
2023-07-21 20:47:39Z: Settings Saved.
hsingh@DESKTOP-424H6PI:~/adevopsagent/agent1$ ./run.sh
  • After following the instructions, the agent is running on your machine.

First Execution of “tf-azdo-pipeline” pipeline

This will create, update or delete the Azure DevOps resources.

  • We executed with default variable values.
  • There are no changes as the state matched the desired configuration.
  • So we have a pipeline to create, update or delete the pipelines or other Azure DevOps resources.
  • “tf-azdo-pipeline” pipeline can manage all the Azure DevOps resources as per configuration in config_azdo.yaml

Can “tf-azdo-pipeline” pipeline delete itself?

Yes. If you manually run the pipeline with the terragrunt_action variable set to “destroy”, the “tf-azdo-pipeline” run executing that job will also get destroyed, since the job is actually running on an agent. You will not be able to see the job information afterwards because the pipeline and its corresponding project will have been deleted by then. I tested it 😃.

Run “tf-infra-pipeline” pipeline

This will create, update or delete Infrastructure (Azure resources) in your Azure subscription.

We have done this already in my last post.

I am just repeating it to show what changes when the “tf-infra-pipeline” pipeline is built using Terraform. Conceptually, nothing changes. The Azure DevOps resources that we created and configured manually in the last post have already been created by the “tf-azdo-pipeline” pipeline, which reduces toil.

Create an App Registration

  • Your GitOps Pipeline will use this Service principal to connect to Azure APIs to create, update or delete resources.
  • On the Azure portal, navigate to Azure Active Directory > App Registrations.
  • Click on New registration and let’s name it “pipelineaccess”.
  • Create a client secret for this App registration.
  • Note this secret somewhere safe.
  • Navigate to the subscription on the Azure portal and click on “Access Control (IAM)”.
  • Assign the Owner role on the subscription to the ‘pipelineaccess’ Service Principal.

You may consider the principle of least privilege: determine which role should be granted instead of “Owner” and use that.
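The app registration and role assignment can also be done in one step with the Azure CLI; this sketch grants Contributor at the subscription scope instead of Owner (the subscription ID is a placeholder), which is a common least-privilege starting point:

```shell
# Create the "pipelineaccess" service principal and assign it the
# Contributor role on the subscription; the command prints the client
# secret, so note it somewhere safe.
az ad sp create-for-rbac \
  --name pipelineaccess \
  --role Contributor \
  --scopes /subscriptions/<subscription-id>
```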

  • So we are all set as far as Service Principal is concerned.

Edit the config YAML file

  • Update terraform/azure/terragrunt/orchestration/config_env_sampleapp.yaml in the “tf-infra-repo” repository.
  • There are 6 config items to update.
  • You need to update your Azure subscription ID, tenant ID, your Laptop’s public key, DB password, Storage Account’s name, and Storage Account’s Resource Group Name.
  • Commit this change and push it to the Azure DevOps git repository.
  • The Azure resources’ configuration is controlled by this YAML file. If and when someone needs to change some configuration, this file is updated.

Never push secrets to Git repos (this is a lab environment); you should rather use Azure Key Vault for secret management.

  • Cancel the execution initiated by commit.
  • The first execution is different because you need to create a storage account as a backend for Terraform statefiles.
  • For the first run, set the variables “first_iteration” and “local_state_file” to “true” and click “Run”.
  • Pipeline Succeeded
  • All Stages and steps passed and Azure resources are created.
  • We can update or destroy Azure resources using the “tf-infra-pipeline” pipeline.
  • For pipeline stages’ explanation refer to my last post.

Key Takeaways

  • Using azuredevops Terraform provider we can manage Azure DevOps resources.
  • Creating and managing the Azure DevOps resources using Terraform gives you much more control.
  • We can manage Azure DevOps resources using an Azure DevOps pipeline that depends on Terraform code repository.
  • This gives you much more control on your setup and configuration.
  • Using Terraform to create Azure DevOps resources helps us achieve pure “pipeline as code”.
  • We can easily enforce policy using OPA when Terraform is used as the IaC tool in pipelines.
  • I found a bug in the Variable group resource where variables of type secret always end up with a null value.
  • I have not considered security implications in this setup.

Please read my other articles as well and share your feedback. If you like the content shared please like, comment, and subscribe for new articles.


Technical Solutions Developer (GCP). Writes about significant learnings and experiences at work.