Archive for the ‘DevOps’ Category

Azure DevOps | Marketplace tasks | A task is missing error though it's installed

October 26, 2022

In my Azure DevOps organization, I had the “Power Platform Build Tools” extension installed from the Marketplace.

However, none of the ‘Power Platform’ tasks were accessible or visible in my pipelines.

Executing an existing pipeline threw the following exception.

A task is missing. The pipeline references a task called 'microsoft-IsvExpTools.PowerPlatform-BuildTools.tool-installer.PowerPlatformToolInstaller'

Reason and Fix:

  • Using ‘Marketplace tasks’ in pipelines was disabled for the organization.
  • From your Azure DevOps portal, go to Organization Settings -> Pipelines -> Settings.
  • Make sure the “Disable Marketplace tasks” setting is turned off.

🙂

Azure DevOps | Power Platform Build Tools | WhoAmI | Capture Environment ID

Azure DevOps, along with Power Platform Build Tools, enables healthy Power Platform ALM.

In this article, I will explain how to use the Power Platform WhoAmI task and capture the ‘Environment ID’.

What is the PowerPlatformWhoAmi task:

  • Verifies a Power Platform environment service connection by connecting and making a WhoAmI request.
  • This task can be useful to include early in the pipeline, to verify connectivity before processing begins.

YAML snippet of WhoAmi:

Following is the YAML snippet which we need to include in the pipeline.

- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.whoami.PowerPlatformWhoAmi@0
  displayName: 'Power Platform WhoAmI'
  inputs:
    # Service Principal/client secret (supports MFA)
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: '{My service connection}'

  • This is how the pipeline YAML looks.
    • Please note that we must have the “PowerPlatformToolInstaller” task before executing the “PowerPlatformWhoAmi” task (see the sketch below).
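
For reference, here is a minimal sketch of both tasks together (the service connection name is a placeholder):

steps:
# The Tool Installer task must run before any other Power Platform task
- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.tool-installer.PowerPlatformToolInstaller@0
  displayName: 'Power Platform Tool Installer'

# WhoAmI verifies the service connection by making a WhoAmI request
- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.whoami.PowerPlatformWhoAmi@0
  displayName: 'Power Platform WhoAmI'
  inputs:
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: '{My service connection}'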

Output of PowerPlatformWhoAmi:

  • When you run the pipeline, you get the following output for ‘PowerPlatformWhoAmi’.
  • The output includes useful ‘Organization Information’ such as Org ID, Unique Name, Environment ID, etc.

Capture the ‘Environment ID’:

  • As we noticed, the output of PowerPlatformWhoAmi contains useful ‘Organization Information’.
  • The PowerPlatformWhoAmi output is persisted in “BuildTools” environment variables.
  • To capture the ‘Environment ID’, use the convention $(BuildTools.EnvironmentId).
  • Use ‘echo’ to test the EnvironmentId, as shown in the sketch below.
  • Once you run the pipeline, you should see the EnvironmentId. Since $(BuildTools.EnvironmentId) is an environment variable, you can use it in the subsequent tasks of the pipeline as well.
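
A minimal script step to echo the captured value (placed after the WhoAmI task) could look like this:

# Echo the Environment ID captured by the PowerPlatformWhoAmi task
- script: echo EnvironmentId is $(BuildTools.EnvironmentId)
  displayName: 'Echo Environment ID'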

🙂

[Code Snippet] PowerShell | Azure DevOps | Query Variable Group using API

In Azure DevOps, Variable Groups store values and secrets that you might want to pass into a YAML pipeline or make available across multiple pipelines.

A Variable Group is a collection of variables, each with a Name and Value pair.

Below is my Variable Group ‘BillingSystemDictionary’ with 3 variables, in my DevOps organization ‘RajeevPentyala0011’ and project ‘Demo’ (these details are required in the PowerShell script).

In DevOps pipeline scenarios, you may need to fetch the variables under a Variable Group. Following is the PowerShell script which calls the DevOps API and fetches variables by Variable Group name.

    # Form API authentication Header
    $headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
    $headers.Add("Authorization", "Bearer $(System.AccessToken)")
    $headers.Add("Content-Type", "application/json")

    # Pass Variable Group Name and read variables
    $variableGroupName = "BillingSystemDictionary"
    # 'projectid' pattern is https://dev.azure.com/{organization_name}/{projectname}
    $projectid = "https://dev.azure.com/RajeevPentyala0011/Demo"

    $variableGroupGetUrl = "$projectid/_apis/distributedtask/variablegroups?groupName=$variableGroupName*&queryOrder=IdDescending&api-version=6.0-preview.2"

    $queryResponseObject = Invoke-RestMethod -Uri $variableGroupGetUrl -Method GET -Headers $headers
 
    # $queryResponseObject will be of type pscustomobject
    # To fetch the variable value of "isactive"
    $variableName = "isactive"
    $variableValue = $queryResponseObject.value.variables.$variableName.value
    Write-Host "$variableName value is $variableValue"
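
If you run this script inline in a YAML pipeline, a minimal sketch of the wrapping step could look like this (the script body is abbreviated; paste the full script from above):

- powershell: |
    # $(System.AccessToken) is expanded by the pipeline at runtime
    $headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
    $headers.Add("Authorization", "Bearer $(System.AccessToken)")
    $headers.Add("Content-Type", "application/json")
    # ... remaining script from above ...
  displayName: 'Query Variable Group'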

🙂

Categories: CRM, DevOps

Azure DevOps | YAML | Could not load type ‘System.Management.Automation.PSSnapIn’

I got the following exception while executing a PowerShell script in my Azure DevOps pipeline step.

Import-Module : Could not load type 'System.Management.Automation.PSSnapIn' from assembly 'System.Management.Automation, Version=7.2.5.500,

Reason:

  • I used the following ‘pwsh‘ task in my YAML file to execute the PowerShell script.
steps:
- pwsh: |
    <My Script>
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: 'My Task'
  condition: succeeded()
  • For the unversed, pwsh is the cross-platform edition of PowerShell built on .NET Core / .NET 5+; by contrast, powershell is the executable name of the legacy Windows PowerShell edition (up to v5.1), built on the Windows-only .NET Framework (up to v4.8).
  • The issue is that ‘System.Management.Automation.PSSnapIn’ exists only in the .NET Framework-based Windows PowerShell, so ‘pwsh‘ can’t be used in this case.

Fix:

  • Replace the pwsh task with the powershell task in the YAML file.
steps:
- powershell: |
    <My Script>
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: 'My Task'
  condition: succeeded()

🙂

Categories: DevOps

Azure DevOps (ADO) | Pipelines | Publish and Access Build Artifacts from Staging Directory

September 23, 2021

In this article, let’s see how to publish and access Build Artifacts using an ADO pipeline.

What are Build Artifacts:

  • Build artifacts are the files generated by your build.
  • In ADO Pipelines, we can use the Build.ArtifactStagingDirectory predefined variable, which is the path on the agent where any artifacts are copied to.

Below are the steps to publish and download the Build Artifacts in an ADO pipeline.

Steps to Publish the Build Artifacts:

  • Let’s see how to publish a file to the Build Artifacts folder, using the ‘Power Platform Export Solution’ task.
  • In this task, I am exporting a solution from the CRM instance and publishing it as $(Build.ArtifactStagingDirectory)\$(SolutionName)_managed.zip by using the ‘Solution Output File’ property.
  • In the ‘Solution Output File’ property, $(SolutionName) is my custom-defined ‘Pipeline Variable’ which contains the solution name.
    • ‘ALM_Base’ is my Dynamics instance solution name.
  • And $(Build.ArtifactStagingDirectory) is a predefined variable.
  • Once the pipeline runs successfully, you can download the published artifacts (i.e., the solution folder) as described below. A YAML sketch of the export step follows this list.
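
As a reference, a minimal YAML sketch of this setup could look like this (the service connection name is a placeholder, and the PublishBuildArtifacts step with the ‘drop’ artifact name is an assumption added for illustration):

# Export the solution into the staging directory, then publish it as a build artifact
- task: PowerPlatformExportSolution@0
  displayName: 'Power Platform Export Solution'
  inputs:
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: '{My service connection}'
    SolutionName: '$(SolutionName)'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName)_managed.zip'
    Managed: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Build Artifacts'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'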

Steps to Download the Build Artifacts:

  • Go to Pipeline Runs and select the latest run.
  • In the run summary, click on the published artifacts link.
  • Select the folder you want to download (a task-based alternative for later jobs is sketched after this list).
  • We also have an option to commit these files to Git using the ‘Command Line Script’ task.
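
If you need the artifact in a later job or stage instead of downloading it manually, a minimal sketch using the Download Build Artifacts task (assuming the ‘drop’ artifact name from the publish sketch above) could look like this:

# Download the previously published 'drop' artifact onto the agent
- task: DownloadBuildArtifacts@0
  displayName: 'Download Build Artifacts'
  inputs:
    buildType: 'current'
    downloadType: 'single'
    artifactName: 'drop'
    downloadPath: '$(System.ArtifactsDirectory)'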

🙂

Categories: DevOps

Azure DevOps (ADO) | Pipeline failure | You need the Git ‘GenericContribute’ permission

September 17, 2021

While pushing code to the repo using a ‘Command Line Script’ task in an ADO pipeline, I got the following permission issue.

  • Following is the script used in the ‘Command Line Script’ task to push the code to the main branch.
echo commit all changes
git config user.email "rajeevpentyala@live.com"
git config user.name "Rajeev Pentyala"
git checkout main
git add --all
git commit -m "solution init"

echo push code to new repo
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin main

Reason for the issue:

  • The account under which the ADO pipeline runs does not have the required permissions to push the code.

Fix:

  • Go to Project Settings -> Repositories.
  • Under the ‘Security’ tab, select ‘ALM Build Service (User_Name)‘ and grant the required privileges (for the ‘GenericContribute’ error, allow ‘Contribute’).

🙂

Azure DevOps (ADO) | Pipeline failure | Failed to connect to Dataverse

September 16, 2021 Leave a comment

One of my ADO pipeline’s ‘Power Platform Publish Customizations’ tasks failed with a “Failed to connect to Dataverse” error.

Reason:

  • The ‘Power Platform Publish Customizations’ task’s ‘Authentication type’ was set to ‘Username/password’, which does not support MFA.
  • MFA (Multi-Factor Authentication) was enabled on the environment I was trying to connect to from the pipeline.
  • Since the selected authentication type cannot satisfy MFA, the pipeline could not connect to the Dataverse environment.

Fix:

  • Create an ‘Application User’ by completing an App Registration in Azure Active Directory and grant it a Security Role.
  • In the ADO pipeline’s ‘Power Platform Publish Customizations’ task, select ‘Authentication type’ as ‘Service Principal’.
  • Make sure the ‘Service Connection’ is configured properly with the Azure App Registration details (i.e., Tenant ID, Application ID and Client Secret).
    • My connection name is ‘SP_ExpAugust21’.
  • Save and run the pipeline; it should work now. A YAML sketch of the task with Service Principal authentication follows this list.
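
For reference, a minimal sketch of this step using the Service Principal service connection could look like this (the tool installer is included since it must run before other Power Platform tasks; task short names follow the Power Platform Build Tools convention):

# Publish customizations using the Service Principal service connection
- task: PowerPlatformToolInstaller@0
  displayName: 'Power Platform Tool Installer'

- task: PowerPlatformPublishCustomizations@0
  displayName: 'Power Platform Publish Customizations'
  inputs:
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: 'SP_ExpAugust21'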

🙂