Azure Pipelines YAML Templates

This article covers YAML templates. By using templates, you can make your pipeline code reusable and simplify sharing it across multiple pipelines.

If you have worked with CI/CD, you are probably familiar with blocks of code repeated multiple times in a single pipeline, or with the same steps duplicated to deploy an application to several environments. The problem with code duplication is that any refactoring of a block must be propagated to each duplicate. Because of this, both the chance of human error and the time required to develop the pipeline increase linearly with the number of duplicates.

Using YAML templates, you can define reusable content, logic, and parameters in separate files that are expanded into your pipeline before it runs. Templates fall into one of four categories (a minimal sketch of how each kind is consumed follows the list):

  • Stage - define a set of stages for related jobs
  • Job - define a collection of steps run by an agent
  • Step - define a linear sequence of operations for a job
  • Variable - alternative to hardcoded values or variable groups
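
As a rough sketch of where each kind plugs in (the template file names below are illustrative, not part of the example project), a pipeline consumes each template type from the matching section:

# consuming each kind of template from a pipeline (file names are illustrative)
variables:
- template: variables-template.yml     # variable template
stages:
- template: stages-template.yml        # stage template
- stage: Build
  jobs:
  - template: jobs-template.yml        # job template
  - job: Build_api
    steps:
    - template: steps-template.yml     # step template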

Example

Suppose you have a repository with the following structure that you want to automate through CI/CD pipelines:

.
├── api
│   ├── dockerfile
│   ├── node_modules
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
├── frontend
│   ├── dockerfile
│   ├── node_modules
│   ├── package.json
│   ├── public
│   ├── src
│   └── yarn.lock
└── pipelines
    └── azure-pipelines.yml

Without using templates, you have to add the tasks that build and test the Node.js API and then build and push its Docker image, and then repeat the same tasks for the React front end. The result looks something like this:

trigger:
- main
pool:
  vmImage: 'ubuntu-latest'
stages:
- stage: Build
  displayName: Build stage
  jobs:  
  - job: Build_api
    displayName: Build API
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '10.x'
      displayName: 'Install Node.js'
    - script: |
        npm install
      workingDirectory: '$(Build.Repository.LocalPath)/api'
      displayName: 'npm install'
  - job: Build_frontend
    displayName: Build Frontend
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '10.x'
      displayName: 'Install Node.js'
    - script: |
        npm install
        npm run build
      workingDirectory: '$(Build.Repository.LocalPath)/frontend'
      displayName: 'npm install and build'
- stage: Test
  displayName: Test stage
  jobs:  
  - job: Test_api
    displayName: Test API
    steps:
    - task: SonarCloudPrepare@1
      inputs:
        SonarCloud: 'SonarCloud training connection'
        organization: 'gtrekter'
        scannerMode: 'CLI'
        configMode: 'manual'
        cliProjectKey: 'GTRekter_Training'
        cliProjectName: 'Training'
        cliSources: '$(Build.Repository.LocalPath)/api'
    - task: SonarCloudAnalyze@1
    - task: SonarCloudPublish@1
      inputs:
        pollingTimeoutSec: '300'
  - job: Test_frontend
    displayName: Test Frontend
    steps:
    - task: SonarCloudPrepare@1
      inputs:
        SonarCloud: 'SonarCloud training connection'
        organization: 'gtrekter'
        scannerMode: 'CLI'
        configMode: 'manual'
        cliProjectKey: 'GTRekter_Training'
        cliProjectName: 'Training'
        cliSources: '$(Build.Repository.LocalPath)/frontend'
    - task: SonarCloudAnalyze@1
    - task: SonarCloudPublish@1
      inputs:
        pollingTimeoutSec: '300'
- stage: Build_dockerimage
  displayName: Build and push image to ACR stage
  jobs:  
  - job: Build_api_dockerimage
    displayName: Build and push API image
    steps:
    - task: Docker@2
      displayName: Build and push an image to ACR
      inputs:
        containerRegistry: 'ACR training connection'
        repository: 'training-api'
        command: 'buildAndPush'
        Dockerfile: '$(Build.SourcesDirectory)/api/dockerfile'
        tags: '$(Build.BuildId)'
  - job: Build_frontend_dockerimage
    displayName: Build and push frontend image
    steps:
    - task: Docker@2
      displayName: Build and push an image to ACR
      inputs:
        containerRegistry: 'ACR training connection'
        repository: 'training-frontend'
        command: 'buildAndPush'
        Dockerfile: '$(Build.SourcesDirectory)/frontend/dockerfile'
        tags: '$(Build.BuildId)'

Running the pipeline executes the three stages above, with separate build, test, and image-push jobs for the API and the front end.

Suppose that you must update the pipeline by, for example, adding unit tests or changing the SonarCloud connection. If you aren’t using a template, you must repeat the same operation for each duplicate.

By using YAML templates, you can split the pipeline into multiple, more manageable components stored in separate files. The structure of the project then becomes the following:

.
├── README.md
├── api
│   ├── dockerfile
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
├── frontend
│   ├── dockerfile
│   ├── node_modules
│   ├── package.json
│   ├── public
│   ├── src
│   └── yarn.lock
└── pipelines
    ├── build-template.yml
    ├── docker-template.yml
    ├── pipeline.yml
    └── sonarcloud-template.yml

In this case, the main pipeline (pipeline.yml) becomes:

trigger:
- main
pool:
  vmImage: 'ubuntu-latest'
stages:
- stage: Build
  displayName: Build stage
  jobs:  
  - job: Build_api
    displayName: Build API
    steps:
      - template: ./build-template.yml
        parameters:
          workingDirectory: '$(Build.Repository.LocalPath)/api'
  - job: Build_frontend
    displayName: Build Frontend
    steps:
      - template: ./build-template.yml
        parameters:
          workingDirectory: '$(Build.Repository.LocalPath)/frontend'
          buildProject: true
- stage: Test
  displayName: Test stage
  jobs:  
  - job: Test_api
    displayName: Test API
    steps:
      - template: ./sonarcloud-template.yml
        parameters:
          workingDirectory: '$(Build.Repository.LocalPath)/api'
  - job: Test_frontend
    displayName: Test Frontend
    steps:
      - template: ./sonarcloud-template.yml
        parameters:
          workingDirectory: '$(Build.Repository.LocalPath)/frontend'
- stage: Build_dockerimage
  displayName: Build and push image to ACR stage
  jobs:  
  - job: Build_api_dockerimage
    displayName: Build and push API image
    steps:
      - template: ./docker-template.yml
        parameters:
          workingDirectory: '$(Build.Repository.LocalPath)/api/dockerfile'
          repository: 'training-api'
  - job: Build_frontend_dockerimage
    displayName: Build and push frontend image
    steps:
      - template: ./docker-template.yml
        parameters:
          workingDirectory: '$(Build.Repository.LocalPath)/frontend/dockerfile'
          repository: 'training-frontend'

As you can see, the steps dedicated to each activity are now collected in a dedicated template file.

build-template.yml

parameters:
- name: workingDirectory 
  type: string 
  default: ''
- name: buildProject
  type: boolean
  default: false
steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'
- script: |
    npm install
  workingDirectory: '${{ parameters.workingDirectory }}'
  displayName: 'npm install'
- script: |
    npm run build
  workingDirectory: '${{ parameters.workingDirectory }}'
  condition: ${{ parameters.buildProject }}
  displayName: 'npm build'

sonarcloud-template.yml

parameters:
- name: workingDirectory 
  type: string 
  default: ''
steps:
- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'SonarCloud training connection'
    organization: 'gtrekter'
    scannerMode: 'CLI'
    configMode: 'manual'
    cliProjectKey: 'GTRekter_Training'
    cliProjectName: 'Training'
    cliSources: '${{ parameters.workingDirectory }}'
- task: SonarCloudAnalyze@1
- task: SonarCloudPublish@1
  inputs:
    pollingTimeoutSec: '300'

docker-template.yml

parameters:
- name: workingDirectory 
  type: string 
  default: ''
- name: repository
  type: string
  default: ''
steps:
- task: Docker@2
  displayName: Build and push an image to ACR
  inputs:
    containerRegistry: 'ACR training connection'
    repository: '${{ parameters.repository }}'
    command: 'buildAndPush'
    Dockerfile: '${{ parameters.workingDirectory }}'
    tags: '$(Build.BuildId)'
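
One detail worth noting in these templates: template parameters use the ${{ }} expression syntax, which Azure Pipelines resolves while expanding the templates, before any step runs, whereas predefined variables such as Build.BuildId keep the $( ) macro syntax and are resolved at runtime on the agent. For example, in docker-template.yml:

# resolved at template-expansion (compile) time, before the agent picks up the job
Dockerfile: '${{ parameters.workingDirectory }}'
# resolved at runtime on the agent
tags: '$(Build.BuildId)'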

Running the templated pipeline produces the same result as before; only the structure of the pipeline code has changed.

Limitations

Azure Pipelines supports up to 100 separate template files in a single pipeline and a maximum of 20 levels of template nesting.
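
Nesting means a template that itself includes other templates; each extra layer of includes counts toward the 20-level limit, and every included file counts toward the 100-file limit. As a minimal sketch reusing the templates above (the combined file name is hypothetical):

# build-and-test-template.yml - a step template that nests two other templates
parameters:
- name: workingDirectory
  type: string
  default: ''
steps:
- template: build-template.yml
  parameters:
    workingDirectory: '${{ parameters.workingDirectory }}'
- template: sonarcloud-template.yml
  parameters:
    workingDirectory: '${{ parameters.workingDirectory }}'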

Wrap Up

In conclusion, YAML templates keep your pipelines simpler to edit and let you focus on application-specific tasks. However, be aware that even a simple change to a template will affect all pipelines that use that template.
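
The same mechanism also works across repositories: if the templates live in their own repository, every pipeline that consumes them declares that repository as a resource. A minimal sketch, assuming a hypothetical Azure Repos Git repository named pipeline-templates:

# declaring the repository that hosts the shared templates (names are hypothetical)
resources:
  repositories:
  - repository: templates                 # alias used after the '@' below
    type: git
    name: Training/pipeline-templates     # <project>/<repository> in Azure Repos
    ref: refs/heads/main
steps:
- template: build-template.yml@templates
  parameters:
    workingDirectory: '$(Build.Repository.LocalPath)/api'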
