GitHub Copilot | Code suggestions

September 2, 2022

If you are a developer, isn't the visual below scarily good? You declare a function with a meaningful name and the code is ready for you as a suggestion. Say 'Hi' to GitHub Copilot code suggestions.

About GitHub Copilot:

  • GitHub Copilot is an AI pair programmer that offers autocomplete-style suggestions as you code.
  • It is optimized to help you write Python, JavaScript, TypeScript, Ruby, Go, C#, or C++.
  • GitHub Copilot is available as an extension in Visual Studio Code, Visual Studio, Neovim and the JetBrains suite of IDEs. For more information, see “Getting started with GitHub Copilot.”

Licensing:

  • GitHub Copilot is available to GitHub customers with a personal account on GitHub.com.
  • GitHub Copilot is free to use for verified students and maintainers of popular open source projects.
  • If you are not a student or a maintainer of a popular open source project, you can try GitHub Copilot for free with a one-time 60-day trial.
  • After the free trial, you will need a paid subscription for continued use.
  • For more information, see “About billing for GitHub Copilot.”

GitHub Copilot as a Visual Studio Extension:

  • You must have Visual Studio 2022 17.2 or later installed.
  • In the Visual Studio toolbar, click Extensions, then click Manage Extensions.
  • In the “Manage Extensions” window, click Visual Studio Marketplace, search for the GitHub Copilot extension, then click Download.
  • Refer to this link for more details.

🙂

Categories: Misc

[Step by Step] Dataverse | Plugins | Using Dependent Assemblies

In this article, I am going to highlight the limitations of conventional plug-in development and how the Dependent Assemblies feature addresses those limitations.

Limitations of the conventional plugins:

As we know, plug-ins can be registered only as an individual .NET assembly, so we can't include another assembly or a resource file within a plug-in.

  • For example, you may want to use Newtonsoft.Json.dll or another assembly, or access a list of localized strings; neither is possible with conventional plug-in development.

Workaround to support multiple assemblies:

ILMerge
  • One way to include another assembly is to merge the assemblies into one using ILMerge.
  • While ILMerge worked for many, it was never supported by Dataverse and it didn’t always work.
  • ILMerge is no longer being maintained.
Shared Projects
  • Another approach is to use Shared Projects.
  • In Shared Projects, the unit of reuse is the source code, and the shared code is incorporated into each assembly that references the shared project.

Both 'ILMerge' and 'Shared Projects' are workarounds and are not recommended. Let's see how 'Dependent Assemblies' addresses these gaps.

Dependent Assemblies:

  • With dependent assemblies, rather than register an individual .NET assembly, you will upload a NuGet Package that contains your plug-in assembly and any dependent assemblies.
  • Unlike ILMerge, we can also include other file resources, such as JSON files containing localized strings. This NuGet package file is stored in a new table called “PluginPackage”.

If things are unclear at this point, don't worry; I will explain step by step with 2 scenarios. But before that, let's get the prerequisites ready.

Prerequisites:

  • Visual Studio 2019 or later.
  • Download the latest Plugin Registration Tool.
  • Download and Install Power Platform CLI
    • Don’t worry if you are not familiar with Power Platform CLI. We will not use complex commands.
    • We will only use one simple command to create Plugin project template.
  • Install NuGet Package Explorer
    • This is optional but a useful tool.
  • A blank solution in your Dataverse environment.

Once you have the prerequisites ready, let's get started by creating the plug-in assembly project.

Create a Plug-in project:

We will create the plug-in project using a Power Platform CLI command. As mentioned earlier, we only use the CLI to create the plug-in project. The steps are as follows.

  • Create a new folder and open the ‘Command Prompt’.
  • Execute the "pac plugin init" command.
  • The "pac plugin init" command creates a .NET Framework class library project, which looks as below.
  • Open the project using Visual Studio and build. A new NuGet package gets generated for the project (see the command sketch below).
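
For reference, a minimal command sketch could look like this (the folder name is hypothetical; you can also build from Visual Studio as described above):

mkdir DependentAssemblyDemo
cd DependentAssemblyDemo
pac plugin init
dotnet build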

Now that we have the project created, let's proceed with our first scenario.

Scenario 1 : Using ‘Newtonsoft.Json.dll’ in Plug-in project

Let's see how easily 'Newtonsoft.Json.dll' can be used in our plug-in project. The scenario I am using is: on pre-create of Account, I set the 'Name' field by parsing a JSON string.

  • Open the plug-in project created in previous step.
  • Rename the ‘Plugin1.cs’ file to ‘PreCreateAcount.cs’.
  • Install the 'Newtonsoft.Json' package using the NuGet Package Manager.
  • Add the following code snippet, which uses the JsonConvert.DeserializeObject method of Newtonsoft.Json, to the "PreCreateAcount.cs" file (see the sketch after this list).
  • Build the project.
  • Note: With 'Dependent Assemblies' we don't need to sign the assemblies; signing is optional. However, the 'pac plugin init' command signs the assembly by default, so go to the project properties and uncheck 'Sign the assembly'.
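
Since the original snippet was shared as a screenshot, here is a minimal sketch of what "PreCreateAcount.cs" could look like. The JSON payload and the 'Person' class are assumptions for illustration; the key point is that JsonConvert comes from the dependent Newtonsoft.Json assembly.

using System;
using Microsoft.Xrm.Sdk;
using Newtonsoft.Json;

namespace DependentAssemblySample
{
    public class PreCreateAcount : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            // Run only when the Target entity is present (pre-create of Account).
            if (context.InputParameters.Contains("Target") &&
                context.InputParameters["Target"] is Entity account &&
                account.LogicalName == "account")
            {
                // Sample JSON string; in a real scenario this could come from a service call or a resource file.
                var json = "{\"Name\":\"Rajeev Pentyala\"}";

                // JsonConvert lives in Newtonsoft.Json.dll, the dependent assembly packaged with the plug-in.
                var person = JsonConvert.DeserializeObject<Person>(json);

                // Set the Account 'Name' field with the parsed value.
                account["name"] = person.Name;
            }
        }
    }

    public class Person
    {
        public string Name { get; set; }
    }
}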

View your NuGet package
  • Once you build the project, as an optional step, you can use NuGet Package Explorer to examine the NuGet package.
  • A little bit of Dependent Assembly background, if you are interested:
    • When you upload your NuGet package, any assemblies that contain plugin classes will be registered in PluginAssembly table and associated with the PluginPackage.
    • At runtime, Dataverse copies the contents of the NuGet package from the PluginPackage row and extracts it to the sandbox runtime.
    • This way, any dependent assemblies needed for the plug-in are available.
Deploying the NuGet package:
  • Build the project and open the plug-in registration tool.
  • Use 'Register New Package' and map the .nupkg file, which will be available under your plug-in project's bin\Debug folder.
  • Select the Dataverse solution which we created as part of prerequisites. You can also use any existing solution.
  • Click on ‘Import’.
  • As a next step, register a new step on pre-create of Account, just like in regular plug-in registration.
  • Post step registration, your package should look similar to the below.

Test the plugin:

Now that we have developed and registered the plug-in package, it's time to test.

  • Open a new Account form and provide some name.
  • Upon save, 'Name' should get replaced with 'Rajeev Pentyala', which is what we implemented in our plug-in.

Scenario 2 : Using ‘Newtonsoft.Json.dll’ in another C# project

In scenario 1, we have seen how to use 'Newtonsoft.Json.dll' in the same plug-in assembly project. In scenario 2, let's see how to use it from another C# project.

  • Add a new 'Class Library' project. I named the project 'Helper'.
  • Install the 'Newtonsoft.Json' package using the NuGet Package Manager.
  • Rename the 'Class1.cs' to 'AccountHelper.cs'.
  • Add the following code snippet (see the sketch after this list).
  • In the plug-in assembly project, add the 'Helper' project as a project reference.
  • In the 'PreCreateAcount.cs' plugin class, call AccountHelper.GetName().
  • Build the solution.
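
Here is a minimal sketch of what "AccountHelper.cs" could look like, assuming a GetName() method that parses a JSON string with Newtonsoft.Json (the payload and the 'Person' class are illustrative assumptions):

using Newtonsoft.Json;

namespace Helper
{
    public static class AccountHelper
    {
        public static string GetName()
        {
            // Sample JSON string parsed via Newtonsoft.Json from the dependent 'Helper' assembly.
            var json = "{\"Name\":\"Rajeev Pentyala\"}";
            var person = JsonConvert.DeserializeObject<Person>(json);
            return person.Name;
        }
    }

    public class Person
    {
        public string Name { get; set; }
    }
}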

Update the NuGet package:

As we changed the logic to fetch the 'Name' from 'AccountHelper' and rebuilt the solution, we need to update the plug-in package.

  • To update the Plugin Package, in the ‘Plugin Registration Tool’, go to View ->Display by Package.
  • Select the latest .nupkg file and update.
  • Note: To delete the package, we also have to go to View -> Display by Package.
  • Test by creating a new Account record.

Limitations:

  • A plugin package is limited to 16 MB in size or 50 assemblies.
  • Workflow extensions, also known as workflow assemblies, workflow activities or custom workflow activities are not supported.
  • On-premises environments are not supported.

Hope you got some idea about 'Dependent Assemblies'. Happy learning 🙂

[Step by Step] Model Driven App | Grids | Navigate to custom page on row click

By default, performing any of the following grid actions opens the table record:

  • Double-clicking the data row, or selecting the primary column link in the row.
  • Selecting a data row, and then pressing the Enter key.
  • On a touch-enabled device, selecting a data row.

In this article, let's see how to override the default grid click behavior and navigate to a Custom Page. The same approach can be used to navigate to a custom URL, show an alert, and so on.

Before jumping on to overriding the default behavior, let's understand how the same requirement can be achieved using a ribbon button.

Conventional way of achieving this requirement:

Assume that you want to redirect to a custom URL (e.g., Google, Bing) from the Main Grid. One common approach would be:

  • Add a custom JScript function with the desired navigation logic and upload it as a web resource.
  • Add a ribbon button on the grid and map the JScript function to it.
  • Select the grid record(s) and click the button, which will redirect you to the custom URL.

In the above approach, the overhead is that every time you have to select a record in the grid and then click the ribbon button.

The 'override default behavior' approach is pretty much the same, except you don't need to click the ribbon button every time.

Using override default behavior:

Let me explain how to override the default behavior of the main grid by redirecting to a custom page on grid row click, using the following steps.

Create a custom page:
  • As we will be redirecting to a custom page on grid click, first let's create a custom page.
  • I've created a simple custom page with 'Account Name' and 'Account Number' fields and a 'Back' button.
  • Copy the 'Name' of the Custom Page, which we will be using in the next steps.
  • Next, you must add your 'Custom Page' to the Model Driven App using the app editor.

Create a webresource:

We will add the 'navigate to Custom Page' logic in a JScript function. If you want to navigate to a custom URL or display an alert instead, just put the appropriate logic in this JScript function.

  • Create a JScript file and copy the following 'openCustomPage' function (see the sketch after this list).
    • Set ‘name’ as the ‘Name‘ of the Custom Page copied in previous step.
  • Save this jScript file as a webresource.
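
A minimal sketch of the 'openCustomPage' function, assuming a hypothetical custom page name ('raj_accountdetail_page') and dialog-style navigation options; replace these with the values from your environment:

function openCustomPage(selectedItems) {
    // 'selectedItems' is supplied by the SelectedControlSelectedItemReferences CrmParameter.
    var selectedItem = selectedItems && selectedItems[0];
    if (!selectedItem) {
        return;
    }

    var pageInput = {
        pageType: "custom",
        name: "raj_accountdetail_page", // hypothetical: the 'Name' of the Custom Page copied earlier
        entityName: selectedItem.TypeName,
        recordId: selectedItem.Id
    };

    var navigationOptions = {
        target: 2,                      // open as a dialog
        position: 1,                    // centered
        width: { value: 50, unit: "%" }
    };

    Xrm.Navigation.navigateTo(pageInput, navigationOptions).then(
        function () {
            // Runs when the dialog is closed (e.g., via the 'Back' button on the custom page).
        },
        function (error) {
            Xrm.Navigation.openAlertDialog({ text: error.message });
        }
    );
}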

Create a new ribbon button

We need to add a button in this ‘override default behavior’ approach as well but with slight changes.

The placement of the button is important. As we want to override the ‘Main Grid’ behavior, we need to add a new button on the ‘Main Grid’.

If you want to override the Contacts subgrid behavior on the Account form, you need to add the new button onto the Contact form.

Let's use Ribbon Workbench to add a new button on the 'Account' Main Grid.

  • Once you add a button, create a command definition with Mscrm.OpenRecordItem as the ID.
    • This is the key. When you try to open a record from the grid, the application looks for the Mscrm.OpenRecordItem command ID and, if one is present, executes the custom action instead of performing the default behavior of opening the table record.
  • Add a 'Custom Javascript Action' to the command and include the CrmParameter with the value SelectedControlSelectedItemReferences.
  • Following is the ‘RibbonDiffXml’ for your reference.
<?xml version="1.0" encoding="utf-16"?>
<RibbonDiffXml xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <CustomActions>
    <CustomAction Id="raj.Mscrm.OpenRecordItem.CustomAction" Location="Mscrm.HomepageGrid.account.MainTab.Management.Controls._children" Sequence="65">
      <CommandUIDefinition>
        <Button Alt="$LocLabels:Mscrm.OpenRecordItem.Alt" Command="Mscrm.OpenRecordItem" Id="Mscrm.OpenRecordItem" LabelText="$LocLabels:Mscrm.OpenRecordItem.LabelText" Sequence="65" TemplateAlias="o3" ToolTipTitle="$LocLabels:Mscrm.OpenRecordItem.ToolTipTitle" ToolTipDescription="$LocLabels:Mscrm.OpenRecordItem.ToolTipDescription" />
      </CommandUIDefinition>
    </CustomAction>
  </CustomActions>
  <Templates>
    <RibbonTemplates Id="Mscrm.Templates" />
  </Templates>
  <CommandDefinitions>
    <CommandDefinition Id="Mscrm.OpenRecordItem">
      <EnableRules />
      <DisplayRules />
      <Actions>
        <JavaScriptFunction FunctionName="openCustomPage" Library="$webresource:raj_account_js">
          <CrmParameter Value="SelectedControlSelectedItemReferences" />
        </JavaScriptFunction>
      </Actions>
    </CommandDefinition>
  </CommandDefinitions>
  <RuleDefinitions>
    <TabDisplayRules />
    <DisplayRules />
    <EnableRules />
  </RuleDefinitions>
  <LocLabels>
    <LocLabel Id="Mscrm.OpenRecordItem.LabelText">
      <Titles>
        <Title description="Open Record Item" languagecode="1033" />
      </Titles>
    </LocLabel>
  </LocLabels>
</RibbonDiffXml>

Note: You can enable or disable the newly created button; either way, the default open behavior will still be overridden.

  • Publish the Ribbon changes.
  • That's all that is needed. If you notice, the ribbon button is common to both the 'conventional' and 'override default behavior' approaches, but the 'override default behavior' approach does not require you to explicitly click the button.

  • Refer to this Microsoft Docs article for more details.

🙂

Azure DevOps | Power Platform Build Tools | WhoAmI | Capture Environment ID

Azure DevOps, along with Power Platform Build Tools, enables a healthy Power Platform ALM.

In this article, I will explain how to use the Power Platform WhoAmI task and capture the 'Environment ID'.

What is the PowerPlatformWhoAmi task:

  • Verifies a Power Platform environment service connection by connecting and making a WhoAmI request.
  • This task can be useful to include early in the pipeline, to verify connectivity before processing begins.

YAML snippet of WhoAmi:

Following is the YAML snippet which we need to include in the pipeline.

- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.whoami.PowerPlatformWhoAmi@0
  displayName: 'Power Platform WhoAmI'

  inputs: 
#   Service Principal/client secret (supports MFA)
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: '{My service connection}'
  • This is how the pipeline YAML looks (see the sketch below).
    • Please note that we must have the "PowerPlatformToolInstaller" task before executing the "PowerPlatformWhoAmi" task.
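
For reference, a minimal sketch of those two tasks together in the pipeline (the service connection name is a placeholder):

steps:
- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.tool-installer.PowerPlatformToolInstaller@0
  displayName: 'Power Platform Tool Installer'

- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.whoami.PowerPlatformWhoAmi@0
  displayName: 'Power Platform WhoAmI'
  inputs:
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: '{My service connection}'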

Output of PowerPlatformWhoAmi:

  • When you run the pipeline, you would get the following output for 'PowerPlatformWhoAmi'.
  • The output contains useful 'Organization Information' such as Org ID, Unique Name, Environment ID, etc.

Capture the ‘Environment ID’:

  • As we noticed, in the output of PowerPlatformWhoAmi, we could see useful ‘Organization Information’.
  • The PowerPlatformWhoAmi output is persisted in "BuildTools" environment variables.
  • To capture the 'Environment ID', use the convention $(BuildTools.EnvironmentId).
  • Use 'echo' to test the EnvironmentId as shown below.
  • Once you run the pipeline, you should see the EnvironmentId. As $(BuildTools.EnvironmentId) is an environment variable, you can use it in subsequent tasks of the pipeline as well.
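
For example, a simple script step like the following (the display name is arbitrary) prints the captured value:

- script: |
    echo "Environment Id: $(BuildTools.EnvironmentId)"
  displayName: 'Print Environment Id'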

🙂

[Step by Step] Power Apps portals | Version Control website using Power Platform CLI

In this article, let's see how to version control the Portal website components using the Power Platform CLI portal commands.

What is Version Control

  • Version control, also known as source control, is the practice of tracking and managing changes to software code.
  • Version control software keeps track of every modification to the code in a special kind of database. If a mistake is made, developers can turn back the clock and compare earlier versions of the code to help fix the mistake while minimizing disruption to all team members.

Let's start working with the Portal CLI, beginning with the prerequisites.

Prerequisites:

Before we jump on to the CLI, let's see how portal components can be edited from the maker portal.

Edit Portal Website from Maker Portal

Once your Portal application is created:

  • Click on the portal app.
  • You will be redirected to your portal website as shown below.
  • If you notice, the portal website is a collection of many components (i.e., Web Pages, Lists, Files, etc.). You can edit the website by using the 'Edit' option directly from the maker portal.
  • You will be redirected to Portal editor and can edit the components.

Challenges with editing Website directly from Maker Portal

As detailed in the above section, you can edit the website directly from the Maker Portal, but with the following challenges.

  • It's not version controlled. Meaning, if I want to revert a file to the content from the last stable release, it's not possible without version control.
  • As a pro developer, you would feel more comfortable editing a webpage as an HTML file than in the portal editor.

Use Power Platform CLI with portals

Let's see how we can solve versioning using the CLI.

  • Open a new ‘Terminal’.
  • As a first step, you need to connect to your Dataverse environment by executing the following command.
    • pac auth create -u {Environment_url}
  • Get the list of available Portals by executing the following command.
    • pac paportal list
  • Copy the 'WebSiteId' from the command output.
  • Now download the website components using the following command.
    • pac paportal download --path [PATH] -id [WebSiteId]
  • The website contents get downloaded and saved locally.
  • Now, open this folder using VS Code and open the components.
  • Let's go ahead and edit the 'Home' web page content.
  • Upload the modified website components using the following command.
    • pac paportal upload --path [Folder-location]
  • The upload only happens for content that’s been changed. In the above example, since the change is made to a webpage, content is uploaded only for the adx_webpage table.
  • Now, lets refresh the portal website and you should see the modified text.
  • If you don't see the changes, you can clear the server-side cache as below.
    • Navigate to the URL as follows: <portal_path>/_services/about.
    • Select Clear Cache.
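
Putting the commands together, a typical session could look like this (the environment URL, local path, and website ID are placeholders):

pac auth create -u https://contoso.crm.dynamics.com
pac paportal list
pac paportal download --path C:\portal-src -id [WebSiteId]
pac paportal upload --path C:\portal-src\[website-folder]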

Advantages of CLI for portals development

  • We can commit the website components to any of the version control tools like Azure DevOps, GitHub.
  • We now get an offline-like capability for portals customization by making changes to the portal content locally. Once all customizations or changes are saved, upload them to the portal.
  • Helps integrate seamlessly with any source control tools, such as “git”
  • Easily set up CI/CD pipelines

Limitations

  • Portals support for Power Platform CLI is limited to the tables listed here.

🙂

Categories: Portals

[Code Snippet] PowerShell | Azure DevOps | Query Variable Group using API

In Azure DevOps, Variable groups store values and secrets that you might want to be passed into a YAML pipeline or make available across multiple pipelines.

A Variable Group is a collection of variables as Name and Value pairs.

Below is my variable group 'BillingSystemDictionary' with 3 variables, in my DevOps organization 'RajeevPentyala0011' and project 'Demo' (these details are required in the PowerShell script).

In DevOps pipeline scenarios, you may need to fetch the variables under a variable group. Following is a PowerShell script which calls the DevOps API and fetches the variables by variable group name.

    # Form API authentication Header
    $headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
    $headers.Add("Authorization", "Bearer $(System.AccessToken)")
    $headers.Add("Content-Type", "application/json")

    # Pass Variable Group Name and read variables
    $variableGroupName = "BillingSystemDictionary"
    # 'projectid' pattern is https://dev.azure.com/{organization_name}/{projectname}
    $projectid = "https://dev.azure.com/RajeevPentyala0011/Demo"

    $variableGroupGetUrl = "$projectid/_apis/distributedtask/variablegroups?groupName=$variableGroupName*&queryOrder=IdDescending&api-version=6.0-preview.2"

    $queryResponseObject = Invoke-RestMethod -Uri $variableGroupGetUrl -Method GET -Headers $headers
 
    # $queryResponseObject will be of type pscustomobject
    # To fetch the variable value of "isactive"
    $variableName = "isactive"
    $variableValue = $queryResponseObject.value.variables.$variableName.value
    Write-Host "$variableName value is $variableValue"

🙂

Categories: CRM, DevOps

Power Apps | Application Insights | Query Telemetry data

Recently, while troubleshooting an issue which was only occurring for a particular user, there was a question from the customer: "How is this user accessing the system?"

To simplify the question, assume that you have a Model Driven App which end users might access through the following channels, and you need to determine which one a particular user used to access the application in the last 24 hours.

  • Different browsers
  • Plug-in Registration
  • XRMToolBox
  • Power Automate
  • Postman
  • etc…

Coming back to troubleshooting: since the issue occurs for only one user, it helps greatly to fetch user access details (Client OS, Region, Host Type, User Agent, etc.) to narrow down the issue and provide a fix.

In this article, let's see how Application Insights telemetry data helps us troubleshoot application slowness, intermittent issues, and more.

Prerequisites:

Once the prerequisites are fulfilled, we need to set up Application Insights for the Power Platform environment.

Configure Application Insights for Power Platform Environment:

  • Set up export to your Application Insights environment from the Power Platform admin center. Steps are mentioned here.
  • Typically, one production environment or tenant maps to one Application Insights environment.
  • Once the data export connection is set up, data will start being exported to your Application Insights environment within the next 24 hours.

Access the Application Insights Telemetry Data:

To access the Application Insights data:

  • Connect to your Azure Portal.
  • Open the ‘Application Insights’ environment which you mapped to your Power Platform environment in the above section.
  • Under Monitor -> Logs, run the Kusto queries.

Before jumping on to executing the queries, I highly recommend going through Telemetry-events-model-driven-apps and Telemetry-events-dataverse.

pageViews table
  • Application Insights page load data goes into the pageViews table.
  • We need to query the pageViews table to fetch the page load / user access related data.
  • Below is a sample query against the pageViews table.
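
A minimal sample query (the user ID is a placeholder), listing page loads for a specific user over the last 24 hours:

pageViews
| where timestamp > ago(24h)
| where user_Id == '[userIdHere]'
| project timestamp, name, duration, client_OS, client_Browser, customDimensions
| order by timestamp desc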

How to fetch the user access details:
  • To determine how the user is accessing the system, use the userAgent attribute in the customDimensions field of the Application Insights pageViews table.
  • Use the following query to get an overview of the different sources from where users are accessing the system:
pageViews
| summarize count() by tostring(customDimensions.userAgent), user_Id

You can also query the dependencies table to look at the Dataverse (UCI) requests made by the app:

dependencies
| where ['type'] == "UCI REQUEST"

Using the ‘Session ID’ to troubleshoot slowness or load issues:

  • Users can share their session ID from the About section of the App.
  • You can then use this ID to find issues by looking at all the activities in that session. Use the following query:
union *
| where session_Id == '[sessionIdHere]'

🙂

Azure DevOps | YAML | Could not load type ‘System.Management.Automation.PSSnapIn’

I got the following exception while executing a PowerShell script in my Azure DevOps pipeline step.

Import-Module : Could not load type 'System.Management.Automation.PSSnapIn' from assembly 'System.Management.Automation, Version=7.2.5.500,

Reason:

  • I used the following 'pwsh' task in my YAML file to execute the PowerShell script.
steps:
- pwsh: |
    <My Script>
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: 'My Task'
  condition: succeeded()
  • For the unversed, pwsh is the cross-platform edition of PowerShell built on .NET Core / .NET 5+; by contrast, powershell is the executable name of the legacy Windows PowerShell edition (v5.1 and earlier), built on the Windows-only .NET Framework (v4.8 and earlier).
  • The issue is that 'System.Management.Automation.PSSnapIn' exists only in the .NET Framework-based Windows PowerShell, so 'pwsh' can't be used in this case.

Fix:

  • Replace the pwsh task with the powershell task in the YAML file.
steps:
- powershell: |
    <My Script>
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  displayName: 'My Task'
  condition: succeeded()

🙂

Categories: DevOps

Power Platform | Managed Environments (preview)

As a Power Platform Admin, it is imperative to get environment insights such as how many apps are not being used, how many apps were shared with security groups, or which DLP policies are applied. So far, there has been no built-in feature in Power Platform to get these insights.

If you are familiar with the Center of Excellence (CoE) Kit, you may already know how to get insights into unused apps and many other metrics. However, the Center of Excellence (CoE) Kit needs to be installed separately.

With the Managed Environments (preview) feature, this gap is being addressed in Power Platform.

Managed Environments

  • Managed Environment is a suite of capabilities that allows admins to manage Power Platform at scale with more control, less effort, and more insights.
  • Admins can enable Managed Environments on any type of environment (i.e., Default/Trial/Sandbox/PROD).

There are four primary elements of Managed Environments:

Enable Managed Environments:

  • Admins can enable Managed Environments using the Power Platform admin center or by using PowerShell.
  • To enable or edit Managed Environments, connect to the Admin Center.
  • On the command bar, for an unmanaged environment, select Enable Managed Environments. For a managed environment, select Edit Managed Environments.
  • Configure Managed Environments settings and then select Enable.
  • Copy and restore environment lifecycle operations require the Managed Environments property to be the same between source and destination.
  • Users with either the global admin, Power Platform service admin or Dynamics 365 admin Azure Active Directory roles are allowed to enable Managed Environments.
  • Users with the Delegated Admin role aren’t allowed to change the Managed Environments property in an environment.
  • Users with the Environment Admin (i.e., System Administrator) security role aren’t allowed to change the Managed Environments property in an environment.

Weekly Digests:

  • Once you opt in for the Weekly digest (i.e., by checking the 'Include this environment' checkbox) in the 'Enable Managed Environments' pane, analytics about your top apps and flows, your most impactful makers, and inactive resources you can safely clean up are distilled and delivered to your mailbox once per week.

Sharing Limits:

Managed Environments allow admins to influence how broadly makers can share canvas apps. There are two sharing controls.

Sharing control | System behavior when checked?
Exclude sharing with security groups | Makers cannot share canvas apps with any security group. Admins get the option to proceed with setting a limit on individuals shared to.
Limit total individuals who can be shared to | Makers cannot share canvas apps with more individuals than specified in the text box.
  • Limit total individuals who can be shared to is only enabled if Exclude sharing with security groups is checked.
  • Sharing rules are enforced when makers attempt to share an app. Sharing rules do not change the audience that apps in an environment are already shared with.
  • Once sharing rules are set in the Power Platform admin center it may take up to 1 hour for the latest sharing rules to be propagated in the system and enforced.

Data Policies:

  • A principal capability of Managed Environments is enforcing Data loss prevention (DLP) policies.
  • New environment filters have been introduced to the data policies page in the Power Platform admin center that will help you identify all the data policies that are applied to an environment.
  • The environment filters are exclusively available for managed environments.
  • Open the Edit Managed Environments settings panel for a managed environment. In the Data policies section, select 'See active data policies for this environment'.
  • The data policies page opens in a new tab and displays only the data policies applied to the managed environment.

License considerations:

  • Managed Environments represents a value-add on top of existing premium Power Platform capabilities.
  • All low code assets (apps/flows) in a managed environment become premium and can be licensed using any of the Power Platform licensing options (per user, per app/flow or pay-as-you-go) or Dynamics 365 licenses that give premium usage rights.
  • Users must have a qualifying license to access the assets.
  • During the public preview the premium license requirement for applications and flows within a managed environment is not enforced.

Please refer to the docs for more info.

🙂

PCF component import error | Publisher prefix for this control does not match the prefix for the solution

If you are new to or a beginner in Power Apps component framework, I recommend referring to the PCF beginner article first.

Coming back to the blog post: the other day, while importing a solution with a PCF component, the solution import failed with the following error:

Publisher prefix XXXX for this control XX.XXXX does not match the prefix XXX for the solution with Id xxxx-xxx-xxxxxxx-xxxxx-xxxxxx"

Reason:

To understand the issue better, let's reproduce it by building a simple PCF control using the steps below:

  • Open Visual Studio Code and point to an empty folder.
  • Open a Terminal and execute the following command to create a control named 'HelloWorld'.
 pac pcf init -n HelloWorld -ns sample -t field -npm true
  • Now build the control by running “npm run build” command.
  • Now that we have the control built, let me push it to my development Dataverse environment.
  • Execute "pac auth create --url https://myorg.crm.dynamics.com" to connect to the Dataverse environment first.
  • Then, push the control to the connected environment using the "pac pcf push --publisher-prefix sample" command.
  • The above 'pac pcf push' command creates a new unmanaged solution with the naming convention "PowerAppsTools_{publisher_prefix}".
    • In my case, a new solution “PowerAppsTools_sample” got created, as I mentioned the Publisher Prefix name as sample.
  • So far, we have built a new PCF control and pushed it to the Dataverse instance.
  • In my development Dataverse environment, I already have a solution named "MyCoreSolution" with publisher 'Rajeev' having the prefix 'raj'.
  • Now I want to include the PCF control which we just built in my "MyCoreSolution" solution and deploy it to another environment.
  • So, add the "HelloWorld" control to the MyCoreSolution solution, export it, and save the zip file on your machine.
  • Open the new environment and try importing the "MyCoreSolution" solution, which we exported in the previous step.
  • The import fails with the following error:
Solution "MyCoreSolution" failed to import: Import Solution Failed: CustomControls with Name sample.HelloWorld Import Solution Failed with following error: Publisher prefix sample for this control sample.HelloWorld does not match the prefix raj for the solution

Fix/Best Practice:

  • Hopefully the above example gives you an idea of what went wrong with the solution import.
  • Because the PCF control prefix (i.e., sample) is different from the "MyCoreSolution" solution's publisher prefix (i.e., raj), we got the exception during import.
  • To avoid such issues, make sure your PCF control prefix matches your solution's publisher prefix when importing the control to a different Dataverse environment.

🙂

Categories: PowerApps