YAML for Build and Release Azure DevOps Pipelines

YAML

YAML - YAML Ain't Markup Language

YAML is a data serialization language, like JSON, XML, and BSON, that is designed to be human-friendly and works with all programming languages.

YAML is commonly used to write configuration files for Infrastructure as Code (IaC) tools and to define automation workflows that execute a series of commands listed in a YAML file. It is also used for configuration files, log files, and sharing data across languages.

YAML is more human-readable than most other data serialization languages. The comparison below shows the difference in structure between JSON and YAML.

JSON and YAML
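As a sketch of that comparison, here is the same record expressed in both formats (the field names are illustrative):

```yaml
# JSON (which is also valid YAML, since YAML 1.2 is a superset of JSON):
# {
#   "name": "Dharmesh",
#   "hobbies": ["cricket", "chess"]
# }

# The same data in YAML:
name: Dharmesh
hobbies:
  - cricket
  - chess
```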

The basic syntax for YAML is given below.

Scalars

Scalars are ordinary values: numbers, strings, booleans.

number-value: 4
floating-point-value: 3.14
boolean-value: true 
# strings can be 'single-quoted', "double-quoted", or unquoted
string-value: "any"

YAML has a number of special characters that you cannot use in unquoted strings: `[] {} : > |`. Quote any string that contains one of these characters, as shown below:

special-characters: "[all] {quoted} :like> this|"

Lists

Lists are collections of elements. Every element of the list is indented and starts with a dash and a space.

name:
  - abc
  - def
  - ghi
  - jkl

Dictionaries

Dictionaries are collections of key-value mappings. All keys are case-sensitive.

Detail:
  name: Dharmesh
  home-planet: Earth
  species: human
  weight: 60kg 
  height: 150cm

Note that a space after the colon is mandatory.
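For instance (a minimal sketch; the key names are illustrative):

```yaml
# valid: a space follows each colon
server:
  host: example.com
  port: 8080

# invalid as a mapping: "port:8080" (no space) is parsed as a single scalar string
```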

A list can also contain dictionaries as its elements.

requests:
  # first item of `requests` list is just a string
  - http://example.com/
 
  # second item of `requests` list is a dictionary
  - url: http://example.com/
    method: GET

Inline Syntax for Lists and Dictionaries

episodes: [1, 2, 3, 4, 5, 6, 7]
detail: {name: Dharmesh, hobby: cricket} 

YAML Multi Documents

The YAML format allows multiple documents to be embedded in a single file. They only have to be separated by a line containing a triple-dash separator (---).

document: this is document 1
---
document: this is document 2

When reading a multi-document YAML file, most parsers and tools treat the documents as separate configurations and load them one by one.

YAML Debugging tips

There are a number of tools you can use to help you locate and fix syntactical errors in your YAML document.

  1. You can use an online service to convert it to JSON and check the structure (useful if you do not have much experience with indentation-based languages).
  2. You can use yamllint to see if there are any errors or issues with your document.
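As an illustration, yamllint itself is configured with a YAML file; a minimal `.yamllint` might look like this (the rule choices here are just an example):

```yaml
# .yamllint -- extends the bundled default ruleset
extends: default

rules:
  line-length:
    max: 120        # allow longer lines than the default 80
  indentation:
    spaces: 2       # require two-space indentation
```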

Azure DevOps YAML Pipeline

In Azure DevOps, we can use YAML to create multistage pipelines instead of creating two pipelines for build and release. Both CI and CD can be done in one pipeline using YAML.

  1. CI: Used to automate builds and tests for the project.
  2. CD: Used to automatically deploy/test code.

What are Azure Pipelines?

  1. A service (part of Azure DevOps) that provides CI/CD capabilities.
  2. Works with most languages and platforms.
  3. Can deploy to most major cloud services, such as Azure.
  4. Supports both the Classic (GUI) editor and YAML.

What tools/languages are supported in Azure pipelines?

  1. Supports most languages: Python, Java, JavaScript, Ruby, C#.
  2. Version control systems: GitHub, GitHub Enterprise, Azure Repos, Bitbucket Cloud, Subversion.
  3. Deploy code to multiple targets: Container registries, virtual machines, on-premise environments, and cloud environments.
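Putting these together, a minimal `azure-pipelines.yml` for a Python project might look like this (a sketch; the script steps and file names are placeholders):

```yaml
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- script: pip install -r requirements.txt
  displayName: 'Install dependencies'
- script: python -m pytest
  displayName: 'Run tests'
```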

Below are the most commonly used YAML scripts for DevOps pipelines.

TimeOut Pipeline

The below pipeline times out after 1 minute: it outputs only "Going to Sleep" and then fails with a timeout error, because the job's timeoutInMinutes is 1 while the script sleeps for 180 seconds.

trigger:
- main

pool:
  name: Default
  demands: agent.name -equals COB-C-00389

stages:
  - stage: myStage
    jobs:
     - job: A
       timeoutInMinutes: 1
       steps:
        - task: PowerShell@2
          inputs:
            targetType: 'inline'
            script: |
              Write-Output "Going to Sleep"
              Start-Sleep -s 180
              Write-Output 'Waking' 

  • trigger: triggers the pipeline when there is a commit on the main branch; specify none for a manual-only trigger.
  • pool: the agent (the machine or cloud-hosted software) that runs the pipeline. You can use Microsoft-hosted agents or your own local machines and virtual machines.
  • stages: specify the build/test/deploy stages of the pipeline. A stage can depend on other stages; a stage contains jobs, and a job contains steps.
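For reference, the trigger section accepts several forms; here is a sketch of common variants (the branch names are illustrative):

```yaml
# run on every push to main
trigger:
- main

# manual runs only
# trigger: none

# finer control with include/exclude branch filters
# trigger:
#   branches:
#     include:
#     - main
#     - releases/*
#     exclude:
#     - releases/old*
```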

Multistage Pipeline

Here stages B and C depend on A, and stage D depends on both B and C: B and C run after stage A, and D runs after B and C finish.

trigger: none

pool:
  name: Default
  demands:
   - agent.name -equals COB-C-00389

stages:
  - stage: A
    jobs:
     - job: A1
       steps:
        - script: echo A1
     - job: A2
       steps:
        - script: echo A2
  - stage: B
    dependsOn: A
    jobs:
     - job: B1
       steps:
        - script: echo B1
     - job: B2
       steps:
        - script: echo B2
  - stage: C
    dependsOn: A
    jobs:
     - job: C1
       steps:
        - script: echo C1
     - job: C2
       steps:
        - script: echo C2
  - stage: D
    dependsOn: 
     - B
     - C
    jobs:
     - job: D1
       steps:
        - script: echo D1
     - job: D2
       steps:
        - script: echo D2

Multiconfiguration Jobs Pipeline Using Strategy

The matrix strategy helps in running multiple copies of a job with different inputs.

trigger: none

pool:
  name: Default
  demands: 
   - agent.name -equals COB-C-00389

stages:
  - stage: MyStage
    jobs:
     -  job: MyJob
        strategy:
          maxParallel: 0
          matrix:
            JOB1:
              version: 1.0
              AppName: foo
            JOB2:
              version: 2.0
              AppName: bar
            JOB3:
              version: 3.0
              AppName: foobar
        steps:
         - script: |
             echo hello $(version)
              echo hi $(AppName)

Selecting the Stages to Run Using Parameters

The parameters determine whether these stages run. You can add a compile-time expression before a stage to check whether the parameter value meets a condition.

The default value of each parameter is false, which means that no stage runs by default.

trigger:
- main

pool:
  vmImage: ubuntu-latest

parameters:
  - name: Development
    displayName: Development
    type: boolean
    default: false
  - name: Quality
    displayName: Quality
    type: boolean
    default: false
  - name: Production
    displayName: Production
    type: boolean
    default: false

stages:
 - ${{ if eq(parameters.Development, true) }}:
   - stage: Build
     jobs:
      - job: Build
        steps:
         - powershell: |
              echo "hello"
              Start-Sleep -Seconds 10
              echo "Dev"

 - ${{ if eq(parameters.Quality, true) }}:
    - stage: Quality
      dependsOn: []
      jobs:
       - job: Quality
         steps:
          - script: echo Quality

 - ${{ if eq(parameters.Production, true) }}:
    - stage: Production
      dependsOn: []
      jobs:
       - job: Production
         steps:
          - script: echo Production

Using Inbuilt Tasks

Azure Pipelines has built-in tasks for building, importing, exporting, testing, packaging, deploying, etc. This makes writing YAML for a pipeline very simple.

Here is the YAML pipeline for moving solutions from one Dynamics 365 environment to another using built-in tasks.


name: $(BuildDefinitionName)-$(Date:yyyyMMdd).$(Rev:r)

trigger:
- none

pool:
  vmImage: vs2017-win2016

stages:
  - stage: Build
    displayName: Build
    jobs:
      - job: Build
        steps:
        - script: md tools
          displayName: 'Create tools directory'  
        - task: MSCRMToolInstaller@12
          inputs:
            nugetFeed: 'official'
            psFeed: 'official'
        - task: MSCRMPackSolution@12
          inputs:
            unpackedFilesFolder: 'Test/solution'
            packageType: 'Both'
            outputPath: 'output'
          displayName: 'Solution Packager: pack solution' 
        - task: PublishBuildArtifacts@1
          inputs:
            PathtoPublish: 'output'
            ArtifactName: 'drop'
            publishLocation: 'Container'
  - stage: Deploy
    dependsOn: Build
    condition: succeeded('Build')
    jobs:
     - job: Deploy
       steps:
       - task: DownloadBuildArtifacts@1
         inputs:
           buildType: 'current'
           downloadType: 'single'
           artifactName: 'drop'
       - task: MSCRMToolInstaller@12
         inputs:
           nugetFeed: 'official'
           psFeed: 'official'
       - task: MSCRMImportSolution@12
         inputs:
           crmConnectionString: 'AuthType=OAuth;Username=*****@*****.onmicrosoft.com;Password=******;Url=https://dharmicrm1prod.crm.dynamics.com/;AppId=*****;RedirectUri=*****;LoginPrompt=Never'
           solutionFile: '$(System.ArtifactsDirectory)\temp\drop\Test_managed.zip'
           override: true
         displayName: Import to Q System
       

Thank You!

