[Step by Step] Power Apps portals | Version Control website using Power Platform CLI

In this article, let's see how to version control Portal website components using the Power Platform CLI portal commands.

What is Version Control

  • Version control, also known as source control, is the practice of tracking and managing changes to software code.
  • Version control software keeps track of every modification to the code in a special kind of database. If a mistake is made, developers can turn back the clock and compare earlier versions of the code to help fix the mistake while minimizing disruption to all team members.

Let's start working with the Portal CLI, beginning with the prerequisites.

Prerequisites:

Before we jump on to the CLI, let's see how portal components can be edited from the maker portal.

Edit Portal Website from Maker Portal

Once your Portal application is created:

  • Click on the portal app.
  • You will be redirected to your portal website as shown below.
  • If you notice, the portal website is a collection of many components (i.e., Web Pages, Lists, Files, etc.). You can edit the website by using the ‘Edit’ option directly from the maker portal.
  • You will be redirected to Portal editor and can edit the components.

Challenges with editing Website directly from Maker Portal

As detailed in the above section, you can edit the website directly from the Maker Portal, but at the cost of the following challenges.

  • It's not version controlled. Meaning, if you want to revert a file to the content from the last stable release, that's not possible without version control.
  • As a pro developer, you would feel more comfortable editing a web page as an HTML file than in the portal editor.

Use Power Platform CLI with portals

Let's see how we can solve versioning using the CLI.

  • Open a new ‘Terminal’.
  • As a first step, you need to connect to your Dataverse environment by executing the following command.
    • pac auth create -u {Environment_url}
  • Get the list of available portals by executing the following command.
    • pac paportal list
  • Copy the ‘WebSiteId’ from the command output.
  • Now download the website components using the following command.
    • pac paportal download --path [PATH] -id [WebSiteId]
  • The website contents get downloaded and saved locally.
  • Now, open this folder in VS Code and browse the components.
  • Let's go ahead and edit the ‘Home’ web page content.
  • Upload the modified website components using the following command (a consolidated end-to-end sketch follows this list).
    • pac paportal upload --path [Folder-location]
  • The upload only happens for content that’s been changed. In the above example, since the change is made to a webpage, content is uploaded only for the adx_webpage table.
  • Now, let's refresh the portal website and you should see the modified text.
  • If you don't see the changes, you can clear the server-side cache as below.
    • Navigate to the URL as follows: <portal_path>/_services/about.
    • Select Clear Cache.
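
Putting the steps above together, here's a minimal end-to-end sketch. The environment URL, folder path, website id, and site folder name below are hypothetical placeholders; substitute your own values from the pac paportal list output.

# Connect to the Dataverse environment (hypothetical URL)
pac auth create -u https://contoso.crm.dynamics.com
# List the portals and note the WebSiteId from the output
pac paportal list
# Download the website components locally (hypothetical path and id)
pac paportal download --path C:\portals -id f88b70cc-580b-4f1a-87c3-41debefeb902
# ...edit the downloaded files in VS Code, then push only the changes back...
pac paportal upload --path C:\portals\starter-portal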

Advantages of CLI for portals development

  • We can commit the website components to any version control tool, such as Azure DevOps or GitHub (see the Git sketch after this list).
  • We can now work on portals customization in an offline-like manner by making changes to the portal content locally; once all customizations or changes are saved, upload them to the portal.
  • Helps integrate seamlessly with any source control tool, such as Git
  • Easily set up CI/CD pipelines
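
For instance, a minimal Git sketch over the downloaded folder might look like the following; the folder path and remote URL are hypothetical placeholders, and Git is assumed to be installed.

cd C:\portals\starter-portal
git init
git add .
git commit -m "Initial portal snapshot"
git branch -M main
git remote add origin https://github.com/contoso/portal-site.git
git push -u origin main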

Limitations

  • Portals support for Power Platform CLI is limited to the tables listed here.

🙂


[Code Snippet] PowerShell | Azure DevOps | Query Variable Group using API

In Azure DevOps, Variable groups store values and secrets that you might want to be passed into a YAML pipeline or make available across multiple pipelines.

A variable group is a collection of variables as name-value pairs.

Below is my variable group ‘BillingSystemDictionary’ with 3 variables, in my DevOps organization ‘RajeevPentyala0011’ and project ‘Demo’ (these details are required in the PowerShell script).

In DevOps pipeline scenarios, you may need to fetch the variables under a variable group. Following is the PowerShell script which calls the DevOps API and fetches variables by variable group name.

    # Form API authentication Header
    $headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
    $headers.Add("Authorization", "Bearer $(System.AccessToken)")
    $headers.Add("Content-Type", "application/json")

    # Pass the Variable Group name and read its variables
    $variableGroupName = "BillingSystemDictionary"
    # $projectid follows the pattern https://dev.azure.com/{organization_name}/{project_name}
    $projectid = "https://dev.azure.com/RajeevPentyala0011/Demo"

    # The trailing * makes groupName a wildcard (prefix) match
    $variableGroupGetUrl = "$projectid/_apis/distributedtask/variablegroups?groupName=$variableGroupName*&queryOrder=IdDescending&api-version=6.0-preview.2"

    $queryResponseObject = Invoke-RestMethod -Uri $variableGroupGetUrl -Method GET -Headers $headers
 
    # $queryResponseObject will be of type pscustomobject
    # To fetch the variable value of "isactive"
    $variableName = "isactive"
    $variableValue = $queryResponseObject.value.variables.$variableName.value
    Write-Host "$variableName value is $variableValue"
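
As an aside, if you need all the variables in the group rather than a single one, you can enumerate the properties of the variables object. Below is a minimal sketch under the same assumptions as the script above.

    # Take the first matching variable group from the response
    $variableGroup = $queryResponseObject.value | Select-Object -First 1
    # Loop through each variable (name-value pair) in the group
    foreach ($variable in $variableGroup.variables.PSObject.Properties) {
        Write-Host "$($variable.Name) = $($variable.Value.value)"
    }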

🙂


Power Apps | Application Insights | Query Telemetry data

Recently, while troubleshooting an issue that was only occurring for a particular user, a customer asked: “How do we determine how this user is accessing the system?”

To simplify the question: assume you have a model-driven app which end users might access through the following modes, and you need to determine which mode a particular user used to access the application in the last 24 hours.

  • Different browsers
  • Plug-in Registration
  • XRMToolBox
  • Power Automate
  • Postman
  • etc…

Coming back to troubleshooting: since the issue was occurring for only one user, it would be a great help to fetch user access details (client OS, region, host type, user agent, etc.) to narrow down the issue and provide a fix.

In this article, let's see how Application Insights telemetry data helps us troubleshoot issues related to application slowness, intermittent behavior, and more.

Prerequisites:

Once the prerequisites are fulfilled, we need to set up Application Insights for the Power Platform environment.

Configure Application Insights for Power Platform Environment:

  • Set up export to your Application Insights environment from the Power Platform admin center. Steps are mentioned here.
  • Typically, one production environment or tenant maps to one Application Insights environment.
  • Once the data export connection is set up, data will start being exported to your Application Insights environment within the next 24 hours.

Access the Application Insights Telemetry Data:

To access the Application Insights data:

  • Connect to your Azure Portal.
  • Open the ‘Application Insights’ environment which you mapped to your Power Platform environment in the above section.
  • Under Monitor -> Logs, run the Kusto queries.

Before jumping into executing the queries, I highly recommend going through Telemetry-events-model-driven-apps and Telemetry-events-dataverse.

pageViews table
  • Application Insights page load data goes into the pageViews table.
  • We need to query the pageViews table to fetch the page load / user access related data.
  • Below is a sample query on the pageViews table.
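A minimal sketch of such a query, assuming you want the page loads from the last 24 hours:

pageViews
| where timestamp > ago(24h)
| project timestamp, name, duration, user_Id, session_Id
| order by timestamp desc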

How to fetch the user access details:
  • To determine how the user is accessing the system, look at the userAgent attribute in the customDimensions field of the Application Insights pageViews table.
  • Use the following query to get an overview of the different sources from where users are accessing the system:
pageViews
| summarize count() by tostring(customDimensions.userAgent), user_Id
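
To scope this to a particular user over the last 24 hours (per the original question), a minimal variation could be:

pageViews
| where timestamp > ago(24h) and user_Id == '[userIdHere]'
| summarize count() by tostring(customDimensions.userAgent)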

To see the requests the model-driven app makes to the server, query the dependencies table for UCI requests:

dependencies
| where ['type'] == "UCI REQUEST"

Using the ‘Session ID’ to troubleshoot slowness or load issues:

  • Users can share their session ID from the About section of the App.
  • You can then use this ID to find issues by looking at all the activities in that session. Use the following query:
union *
| where session_Id == '[sessionIdHere]'

🙂

Azure DevOps | YAML | Could not load type ‘System.Management.Automation.PSSnapIn’

I got the following exception while executing a PowerShell script in my Azure DevOps pipeline step.

Import-Module : Could not load type 'System.Management.Automation.PSSnapIn' from assembly 'System.Management.Automation, Version=7.2.5.500,

Reason:

  • I used the following ‘pwsh‘ task in my YAML file to execute the PowerShell script.
steps:
- pwsh: |
    <My Script>
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: 'My Task'
  condition: succeeded()
  • For the unversed: pwsh is the cross-platform edition of PowerShell built on .NET Core / .NET 5+; by contrast, powershell is the executable name of the legacy Windows PowerShell edition (v5.1 and earlier), built on the Windows-only .NET Framework (v4.8 and earlier).
  • The issue is that ‘System.Management.Automation.PSSnapIn’ is a Windows PowerShell concept built on the .NET Framework and removed from the cross-platform edition, so ‘pwsh‘ can't be used in this case.

Fix:

  • Replace the pwsh task with the powershell task in the YAML file.
steps:
- powershell: |
    <My Script>
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: 'My Task'
  condition: succeeded()
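
One related caveat: the powershell task runs Windows PowerShell, which is only present on Windows agents, so make sure the job targets a Windows image, for example:

pool:
  vmImage: 'windows-latest'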

🙂


Power Platform | Managed Environments (preview)

As a Power Platform admin, it's imperative to get environment insights such as how many apps are not being used, how many apps were shared with security groups, or which DLP policies are applied. So far, there was no built-in feature in Power Platform to get these insights.

If you are familiar with the Center of Excellence (CoE) Kit, you already know how to get insights into unused apps and many other metrics. However, the CoE Kit needs to be installed separately.

With the Managed Environments (preview) feature, this gap is being addressed in Power Platform.

Managed Environments

  • Managed Environments is a suite of capabilities that allows admins to manage Power Platform at scale with more control, less effort, and more insights.
  • Admins can enable Managed Environments on any type of environment (i.e., Default/Trial/Sandbox/PROD).

There are four primary elements of Managed Environments; they are covered in the sections below.

Enable Managed Environments:

  • Admins can enable Managed Environments using the Power Platform admin center or by using PowerShell (see the sketch after this list).
  • To enable or edit Managed Environments, connect to the admin center.
  • On the command bar, for an unmanaged environment, select Enable Managed Environments. For a managed environment, select Edit Managed Environments.
  • Configure the Managed Environments settings and then select Enable.
  • Copy and restore environment lifecycle operations require the Managed Environments property to be the same between source and destination.
  • Users with the global admin, Power Platform service admin, or Dynamics 365 admin Azure Active Directory roles are allowed to enable Managed Environments.
  • Users with the Delegated Admin role aren't allowed to change the Managed Environments property in an environment.
  • Users with the Environment Admin (i.e., System Administrator) security role aren't allowed to change the Managed Environments property in an environment.
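
For the PowerShell route, below is a minimal sketch assuming the Microsoft.PowerApps.Administration.PowerShell module is installed; the environment id is a placeholder, and a protectionLevel of "Standard" marks an environment as managed ("Basic" reverts it to unmanaged).

Import-Module Microsoft.PowerApps.Administration.PowerShell
Add-PowerAppsAccount

# Build the governance configuration and apply it to the environment (placeholder id)
$governanceConfig = [pscustomobject]@{ protectionLevel = "Standard" }
Set-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName "<environment-id>" -UpdatedGovernanceConfiguration $governanceConfig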

Weekly Digests:

  • Once you opt in to the weekly digest (i.e., check the ‘Include this environment’ checkbox) in the ‘Enable Managed Environments’ pane, analytics about your top apps and flows, your most impactful makers, and inactive resources you can safely clean up are distilled and delivered to your mailbox once per week.

Sharing Limits:

Managed Environments allow admins to influence how broadly makers can share canvas apps. There are two sharing controls.

  • Exclude sharing with security groups: when checked, makers cannot share canvas apps with any security group, and admins get the option to proceed with setting a limit on individuals shared to.
  • Limit total individuals who can be shared to: when checked, makers cannot share canvas apps with more individuals than specified in the text box.
  • Limit total individuals who can be shared to is only enabled if Exclude sharing with security groups is checked.
  • Sharing rules are enforced when makers attempt to share an app. Sharing rules do not change the audience that apps in an environment are already shared with.
  • Once sharing rules are set in the Power Platform admin center, it may take up to 1 hour for the latest sharing rules to be propagated in the system and enforced.

Data Policies:

  • A principal capability of Managed Environments is enforcing Data loss prevention (DLP) policies.
  • New environment filters have been introduced to the data policies page in the Power Platform admin center that will help you identify all the data policies that are applied to an environment.
  • The environment filters are exclusively available for managed environments.
  • Open the Edit Managed Environments settings panel for a managed environment. In the Data policies section, select See active data policies for this environment.
  • The data policies page opens in a new tab and displays only the data policies applied to the managed environment.

License considerations:

  • Managed Environments represents a value-add on top of existing premium Power Platform capabilities.
  • All low code assets (apps/flows) in a managed environment become premium and can be licensed using any of the Power Platform licensing options (per user, per app/flow or pay-as-you-go) or Dynamics 365 licenses that give premium usage rights.
  • Users must have a qualifying license to access the assets.
  • During the public preview the premium license requirement for applications and flows within a managed environment is not enforced.

Please refer to the docs for more info.


🙂

PCF component import error | Publisher prefix for this control does not match the prefix for the solution

If you are new to the Power Apps component framework, I recommend referring to a PCF beginner article first.

Coming back to the blog post: the other day, while importing a solution with a PCF component, the solution import failed with the following error.

Publisher prefix XXXX for this control XX.XXXX does not match the prefix XXX for the solution with Id xxxx-xxx-xxxxxxx-xxxxx-xxxxxx"

Reason:

To understand the issue better, let's reproduce it by building a simple PCF control using the steps below:

  • Open Visual Studio Code and point to an empty folder.
  • Open a Terminal and execute the following command to create a control named ‘HelloWorld’.
 pac pcf init -n HelloWorld -ns sample -t field -npm true
  • Now build the control by running the “npm run build” command.
  • Now that we have the control built, let's push it to my development Dataverse environment.
  • Execute “pac auth create --url https://myorg.crm.dynamics.com” to connect to the Dataverse environment first.
  • Then, push the control to the connected environment using the “pac pcf push --publisher-prefix sample” command.
  • The ‘pac pcf push‘ command creates a new unmanaged solution with the naming convention “PowerAppsTools_{publisher_prefix}”.
    • In my case, a new solution “PowerAppsTools_sample” got created, since I passed the publisher prefix as ‘sample’.
  • So far, we have built a new PCF control and pushed it to the Dataverse instance.
  • In my development Dataverse environment, I already have a solution named “MyCoreSolution” with publisher ‘Rajeev’, whose prefix is ‘raj‘.
  • Now I want to include the PCF control which we just built in my “MyCoreSolution” solution and deploy it to another environment.
  • So add the “HelloWorld” control to the MyCoreSolution solution, export it, and save the zip file on your machine.
  • Open the new environment and try importing the “MyCoreSolution” solution which we exported in the previous step.
  • The import fails with the following error:
Solution "MyCoreSolution" failed to import: Import Solution Failed: CustomControls with Name sample.HelloWorld Import Solution Failed with following error: Publisher prefix sample for this control sample.HelloWorld does not match the prefix raj for the solution

Fix/Best Practice:

  • Hopefully the above example gives you an idea of what went wrong with the solution import.
  • Because the PCF control prefix (i.e., ‘sample’) is different from the “MyCoreSolution” solution's publisher prefix (i.e., ‘raj’), we got the exception during import.
  • To avoid such issues, make sure your PCF control prefix matches your solution's publisher prefix when importing the control into a different Dataverse environment (see the sketch below).
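
In the HelloWorld example above, that means pushing the control with the target solution's prefix from the start; a minimal sketch:

pac pcf push --publisher-prefix raj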

🙂


General availability of Power Apps for Windows

With “Power Apps for Windows”

  • You can find all your canvas and model-driven apps including your Dynamics 365 apps and use them online or offline just like you can on iOS and Android!
  • You can configure model-driven apps to work automatically offline by creating an offline profile. For canvas apps, you can use the LoadData/SaveData functions to create a seamless offline experience.
  • Get device capabilities like camera, microphone, file picker, barcode scanning, geo-location and many others.

This app has been optimized for great app performance and supports the latest advanced features provided by the Power Platform, like the native Dataverse connector, guest access, and AI Builder.

Installing the App

  • Go to the Microsoft Store and install Power Apps for Windows.
  • When the app is installed, open it and sign in.
  • All your canvas and model-driven apps, including your Dynamics 365 apps, will show up as below.

Refer to the official documentation for more details.

🙂


PCF | Using Fluent UI and React | Error during react installation

The other day, while building a PCF project using Microsoft Fluent UI and React, I got an error while installing React using the following command.

npm install react react-dom @fluentui/react

Reason and Fix:

  • The reason was that React 18 was installed on my machine, and Fluent UI will not install since it requires a React version lower than 18.
  • To fix the issue, run the following commands, which uninstall the current version and install a Fluent-compatible version of React.
npm uninstall react react-dom @fluentui/react
npm install react@16.8.6 react-dom@16.8.6 @fluentui/react@8.29.0
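
To verify which React version ended up installed (before or after the fix), you can run:

npm list react react-dom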

Please refer to Scott Durow's article on the same issue.

🙂


Power Platform CLI | Pack and Unpack solution using map xml


Microsoft Power Platform CLI is a developer command-line interface that empowers developers and ISVs to perform various operations related to environment lifecycle, authentication, and work with Dataverse environments, solution packages, portals, code components, and so on.

In this article, let's learn how to use the pack and unpack solution commands with a map XML file.

Getting started with CLI:

  • There are 2 ways to install and use the CLI.
    • Standalone Power Platform CLI:
    • Using Power Platform Tools for Visual Studio Code:
      • Follow steps specified here.
      • Power Platform Tools for Visual Studio Code is available on Windows 10, Windows 11, Linux, and MacOS.

Understanding Unpack and Pack commands:

  • Solutions are the mechanism for implementing application lifecycle management (ALM).
  • It's imperative to version control the solution components in platforms such as Azure DevOps or GitHub to maintain healthy ALM.
  • With the help of the unpack and pack commands, we can commit the solution components, which enables version control.
Using Unpack command:
  • Lets you unpack solution zip files after they’ve been exported to the filesystem.
  • First export the solution to your machine.
    • In my case, I have the ‘almplugins_1_0_0_0‘ solution exported and placed on the ‘D’ drive.
  • To use ‘Unpack’, open the Command Prompt in ‘Admin Mode’ and run the following command.
pac solution unpack -z D:\Practice\CLI\almplugins_1_0_0_0.zip -f D:\Practice\CLI\almpluginsunpacked
  • Once the command executes successfully, the solution components unpack to the “D:\Practice\CLI\almpluginsunpacked” path specified in the unpack command.
    • Note: My ‘almplugins_1_0_0_0.zip‘ solution had only a ‘Plug-in assembly’ and ‘Plugin Steps’, hence you would see only those designated folders.
    • If the solution contains other components like entities, security roles, flows, etc., other relevant folders get generated.

Using pack command:
  • Lets you pack files on a filesystem into a solution zip file.
  • Now, let's pack the solution components from the almpluginsunpacked folder using the following pack command.
pac solution pack -z D:\Practice\CLI\almpluginpacked.zip -f D:\Practice\CLI\almpluginsunpacked
  • Once the command executes, you will find a new zip file with the name (i.e., almpluginpacked.zip) specified in the ‘pack’ command.

Now that you have some knowledge of unpack and pack, let's use them with a map XML file.

Why we need map xml:

  • Files that are built in an automated build system, such as plug-in assemblies, are typically not checked into source control.
  • If you open the almpluginsunpacked folder generated by unpack, there will be .dlls which are not supposed to be present.
  • Also, while packing the solution components using ‘pack’, you would want the latest .dlls from your plug-in VS project to be packed.
  • The map XML helps in the above 2 scenarios. Below is a generic map file; let's see its usage in the next sections.
<?xml version="1.0" encoding="utf-8"?>
<Mapping>
    <FileToFile map="PFESamplePlugins.dll" to="..\..\src\Plugins\bin\**\PFE.Sample.Plugins.dll" />

    <FileToPath map="WebResources\*.*" to="..\..\src\WebResources\dist\**" />
    <FileToPath map="WebResources\**\*.*" to="..\..\src\WebResources\dist\**" />
</Mapping>

unpack command with map xml:

  • As my almplugins_1_0_0_0 solution has 2 plug-in assemblies, I got the following 2 folders by executing the unpack command.
  • If I go deeper into the folders, there are alm-plugins2.dll and almplugins.dll, as they were unpacked from the almplugins_1_0_0_0 solution.
  • Let's run unpack again with map.xml.
Prepare map xml
  • My Plug-in projects folder structure is as follows.
  • Path of my alm-plugins2.dll will be C:\Users\rajeevpe\Source\repos\alm-plugins-v2\alm-plugins2\bin\Release\alm-plugins2.dll
  • Path of my alm.plugins.dll will be C:\Users\rajeevpe\Source\repos\alm-plugins-v2\alm-plugins\bin\Release\alm.plugins.dll
  • The map.xml structure will be as below:
<?xml version="1.0" encoding="utf-8"?>
<Mapping>
    <FileToFile map="D:\Practice\CLI\almpluginsunpacked\PluginAssemblies\**\alm-plugins2.dll" to="C:\Users\rajeevpe\Source\repos\alm-plugins-v2\alm-plugins2\bin\Release\alm-plugins2.dll" />
    <FileToFile map="D:\Practice\CLI\almpluginsunpacked\PluginAssemblies\**\almplugins.dll" to="C:\Users\rajeevpe\Source\repos\alm-plugins-v2\alm-plugins\bin\Release\alm.plugins.dll" />    
</Mapping>
  • Save the map XML file with the above content and execute the following unpack command.
pac solution unpack -z D:\Practice\CLI\almplugins_1_0_0_0.zip -f D:\Practice\CLI\almpluginsunpacked\. -m D:\Practice\CLI\map.xml
  • This time, the map file was considered by the unpack command, and generation of the DLLs in the unpack folder location was skipped.

pack command with map xml:

As we have seen, unpack with a map file does not generate DLLs into the unpacked folder. The objective of the map file during pack is to pick up the latest DLLs from your plug-in VS project location.

Since we already prepared the map XML file, we can refer to it in the pack command as below.

pac solution pack -z D:\Practice\CLI\almplugins_1_0_0_0.zip -f D:\Practice\CLI\almpluginsunpacked\. -m D:\Practice\CLI\map.xml
  • If you notice, the DLLs were mapped (i.e., copied) from the plug-in VS project locations and packed into a zip file which can be imported into target environments.

Hopefully you gained some knowledge of using the pack and unpack commands with a map XML file. In this article I only considered plug-ins, but the map file can be used with web resources as well.

The following articles provide more details.

Note:

  • The Azure DevOps pipeline tasks ‘Power Platform Pack Solution’ and ‘Power Platform Unpack Solution’ use the PAC CLI, and at the time of writing this article, neither task supports the map XML.

🙂


Power Platform CLI | Authenticate environment using device code

For the unversed, Microsoft Power Platform CLI is a developer CLI that empowers developers and ISVs to perform various operations related to environment lifecycle, authentication, and work with Dataverse environments, solution packages, portals, code components, and so on.

In this blog post, let's see how to connect to an environment using the Power Platform CLI ‘Device Code’ authentication option.

Using CLI:

  • Open Command Prompt.
  • Execute the pac command to make sure the Power Platform CLI is installed on your machine. You get the following screen if the CLI is already installed.
  • If not installed, download and install the Microsoft Power Platform CLI.
  • Now, to connect to a Power Platform environment, we can use the following pac auth create command by passing either a Username & Password or an Application ID & Client Secret combination (hypothetical examples follow this list).
pac auth create [--name] [--kind] [--url] [--username] [--password] [--applicationId] [--clientSecret] [--tenant] [--cloud] [--deviceCode]
  • We can also authenticate using ‘deviceCode’, which I will explain in the next section.
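
For reference, here are hypothetical examples of both combinations, with placeholder values throughout; the flags are the ones listed in the command above.

pac auth create --url https://contoso.crm.dynamics.com --username admin@contoso.com --password <password>
pac auth create --url https://contoso.crm.dynamics.com --applicationId <app-id> --clientSecret <client-secret> --tenant <tenant-id>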

Authenticate using Device Code:

Now we can also authenticate to an environment by generating a ‘Device Code’, as described in the steps below.

  • From the console, run the following pac auth create command:
pac auth create --url https://{your environment}.crm.dynamics.com -dc
  • You would get a notification with a ‘Device Code‘: “To sign in, use a web browser to open the page https://microsoft.com/devicelogin and enter the code RXXXXZ to authenticate.”
  • Complete the sign in by providing your credentials.
  • You will get the following message after successful authentication.
  • Now you can close the browser; you are good to start executing pac CLI commands from the console.

Key Points:

  • Currently, Microsoft Power Platform standalone CLI is supported only on Windows 10 and Windows 11.
  • Power Platform Tools for Visual Studio Code is available on Windows 10, Windows 11, Linux, and MacOS.

🙂
