Archive

Author Archive

Azure DevOps (ADO) | Pipeline failure | You need the Git ‘GenericContribute’ permission

September 17, 2021 Leave a comment

While pushing code to a repo using the ‘Command Line Script’ task in an ADO pipeline, I got the following permission issue.

  • Following is the script used in the ‘Command Line Script’ task to push the code to the main branch.
echo commit all changes
git config user.email "rajeevpentyala@live.com"
git config user.name "Rajeev Pentyala"
git checkout main
git add --all
git commit -m "solution init"

echo push code to new repo
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin main

Reason for the issue:

  • The account under which the ADO pipeline runs does not have the required permissions to push the code.

Fix:

  • Go to Project Settings -> Repositories.
  • Under the ‘Security’ tab, select ‘ALM Build Service (User_Name)’ and grant the ‘Contribute’ permission.

🙂

Dataverse | New Text Formats | json, richtext

September 16, 2021 Leave a comment

Text in Dataverse is a data type that can store up to 4,000 characters. Text has multiple formats that instruct the UI to treat the content differently.

As an example, Email is a text format that tells the client to treat the contents of the field as an email address. It can display the data as a link that, when clicked, launches your default email client and inserts the address in the To: field.

Two new formats, json and richtext, have been introduced.

Json 

  • The json format will store text strings in JSON format.
  • This format does not perform any validation on the correctness of the JSON; it simply allows you to store, view, and retrieve the content with JSON markup.
  • This format is currently limited to use on non-SQL tables (like Data Lake).

Richtext 

  • The richtext format will allow the use of markup tags to format your text when viewed in a compatible control.
  • By setting this value you can enable richtext for any other text or multiline text column (see the sketch after this list).
  • A control is coming soon in the Canvas and Model Driven space that will be used whenever a column indicates it is richtext.
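
For a feel of how this works at the API level, below is a minimal sketch assuming a hypothetical multiline text column ‘raj_notes’ on the account table whose format has been switched to richtext. The markup is stored as ordinary text, and a richtext-aware control renders it on the form.

// Hypothetical example: 'raj_notes' is a multiline text column on 'account'
// whose format is set to richtext. The markup is stored as plain text.
var entity = {};
entity.raj_notes = "<p>Follow up with <strong>Contoso</strong> by <em>Friday</em>.</p>";

// Placeholder record id - replace with a real account id.
Xrm.WebApi.online.updateRecord("account", "00000000-0000-0000-0000-000000000001", entity).then(
    function success(result) {
        console.log("richtext content saved");
    },
    function (error) {
        Xrm.Utility.alertDialog(error.message);
    }
);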

Please refer to this article for more details.

🙂

Categories: CRM

Azure DevOps (ADO) | Pipeline failure | Failed to connect to Dataverse

September 16, 2021 Leave a comment

In one of my ADO pipelines, the ‘Power Platform Publish Customizations’ task failed with a “Failed to connect to Dataverse” error.

Reason:

  • The ‘Power Platform Publish Customizations’ task’s ‘Authentication type’ was set to ‘Username/password’, which does not support MFA.
  • MFA (Multi-Factor Authentication) was enabled on the environment I was trying to connect to from the pipeline.
  • As a result, the pipeline could not connect to the Dataverse environment.

Fix:

  • Create an ‘Application User’ by completing an App Registration in Azure Active Directory, and grant it a Security Role.
  • In the ADO pipeline’s ‘Power Platform Publish Customizations’ task, select ‘Authentication type’ as ‘Service Principal’.
  • Make sure the ‘Service Connection’ is configured properly with the Azure App Registration details (i.e., Tenant ID, Application ID, and Client Secret).
    • My connection name is ‘SP_ExpAugust21’ and the details are as follows.
  • Save and run the pipeline; it should work now.

🙂

Azure DevOps (ADO) | Pipeline failure | Could not get the latest source version

September 15, 2021 1 comment

I created a new ADO project and configured a pipeline to export a Power Apps solution. While running the pipeline, it failed immediately with the following exception.

The pipeline is not valid. Could not get the latest source version for repository…

Reason:

  • Under the ‘Get sources’ step of the pipeline, ‘Default branch for manual and scheduled builds’ was auto-selected as master.
  • However, my existing branch name was main.
What is the difference between the master and main branches?
  • In ADO, the default branch name used to be master until October 2020. After that, the default branch name changed to main.
  • This branch name change has not taken effect in Pipelines: a new pipeline still defaults the branch name to master, which is invalid here; it has to be main.
  • Refer to this ADO product blog for more details.

Fix:

  • In the ‘Get sources’ step of the pipeline, change the Default Branch from master to main.
  • Save and rerun the pipeline.

🙂

Categories: Azure, CRM

Dataverse Web API | JScript | EDM.Date conversion issue

September 15, 2021 Leave a comment

While triggering a ‘Create’ action from JScript using the Web API, we were getting the following Edm.Date conversion exception:

An error occurred while validating input parameters: Microsoft.OData.ODataException: Cannot convert the literal '2021-09-22T18:30:00' to the expected type 'Edm.Date'. ---> System.FormatException: String '2021-09-22T18:30:00' was not recognized as a valid Edm.Date.

Reason:

  • In the Create request payload, there was a date value which was causing the Edm.Date conversion issue.
  • Following is the code snippet used to create a record for the custom table ‘raj_jobhistory’:
var entity = {};
entity.raj_companyname = "Microsoft";
entity.raj_startdate = Date.now();
entity["raj_employer@odata.bind"] = "/raj_employers(8D34D0F4-3F11-EC11-B6E6-000D3A3BA21F)";

Xrm.WebApi.online.createRecord("raj_jobhistory", entity).then(
    function success(result) {
        var newEntityId = result.id;
    },
    function(error) {
        Xrm.Utility.alertDialog(error.message);
    }
);
  • In the above script, ‘raj_startdate’ is a ‘Date only’ field which was being set to ‘Date.now()’.
  • For a ‘Date only’ field, the Web API accepts the ‘yyyy-mm-dd’ format. Since the date was not being sent in the yyyy-mm-dd format, the application threw the exception.

Fix:

  • Use the toISOString() function and take the date portion to get the yyyy-mm-dd format (a local-time variant is sketched after the snippet).
  • Following is the modified script.
var entity = {};
entity.raj_companyname = "Microsoft";
// Use toISOString() and take the date portion to get the yyyy-mm-dd format
entity.raj_startdate = new Date().toISOString().split('T')[0];
entity["raj_employer@odata.bind"] = "/raj_employers(8D34D0F4-3F11-EC11-B6E6-000D3A3BA21F)";

Xrm.WebApi.online.createRecord("raj_jobhistory", entity).then(
    function success(result) {
        var newEntityId = result.id;
    },
    function(error) {
        Xrm.Utility.alertDialog(error.message);
    }
);
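
Note that toISOString() converts the date to UTC, which can shift a ‘Date only’ value across midnight depending on your time zone. If the local date is needed instead, a small hypothetical helper like toDateOnly below can be used.

// Hypothetical helper: formats a Date as yyyy-mm-dd using local time,
// avoiding the UTC conversion performed by toISOString().
function toDateOnly(date) {
    var month = String(date.getMonth() + 1).padStart(2, "0");
    var day = String(date.getDate()).padStart(2, "0");
    return date.getFullYear() + "-" + month + "-" + day;
}

// Usage: entity.raj_startdate = toDateOnly(new Date());
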
Notes:
  • EDM stands for Entity Data Model. An OData service uses this abstract data model to describe the data it exposes.
  • Refer to this article on the usage of the Dataverse Web API.

🙂

Categories: PowerApps

[Quick Tip] Shortcut to open ‘Command Prompt’ pointing to a folder

September 6, 2021 Leave a comment

In this article, let’s learn a shortcut to open a Command Prompt pointing to a folder on a Windows machine.

Regular Approach:
  • To open a folder on a drive other than C, we launch the Command Prompt, switch to the drive, and point it to the folder using the ‘cd’ command.
  • For example, pointing the Command Prompt to the “D:\E\Practice\S2S\packages\Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool.9.1.0.24\tools” folder using the ‘cd’ command.

Shortcut:
  • Go to the folder in File Explorer, type cmd in the address bar, and hit Enter.
  • A ‘Command Prompt’ window opens up, already pointing to the folder you launched it from.

🙂

Categories: Misc

Power Platform Tools | Developer Toolkit for Visual Studio 2019

September 1, 2021 2 comments

Power Platform Tools for Visual Studio supports the rapid creation, debugging, and deployment of plug-ins.

While Power Platform Tools for Visual Studio is similar in appearance and function to the Developer Toolkit for Microsoft Dynamics CRM 2013, it is a new product and completely independent of the Developer Toolkit.

Power Platform Tools is not directly compatible with any templates or projects from the Developer Toolkit, and vice versa.

Steps to enable ‘Power Platform Tools’ extension in VS 2019:

  • In Visual Studio 2019, click ‘Extensions -> Manage Extensions’.
  • Expand the left navigation panel node Online > Visual Studio Marketplace, search for “Power Platform Tools”, then click ‘Download’.
  • After the download, close all Visual Studio instances and wait a few seconds; you would get the installer screen shown below.
  • Click ‘Modify’ and complete the installation.
  • After installation, open Visual Studio and start ‘Create a new project’.
  • You should find the ‘New Visual Studio Solution Template for Dynamics 365’ solution template.
  • Provide a project name and click ‘Create’, which opens up the familiar ‘Developer Toolkit’ experience.
Note:
  • After installing Power Platform Tools, you will not find any Power Platform Tools related menu items or views in the Visual Studio user interface until you create or load a Visual Studio solution that contains at least one project created from a Power Platform Tools template.

Refer to the Microsoft docs for detailed steps to install Power Platform Tools in Visual Studio.

🙂

Tip | Model Driven Apps | Client API | setSharedVariable and getSharedVariable

As we know, client-side scripting using JavaScript is one of the ways to apply custom business process logic for displaying data on a form in a model-driven app. In this article, let’s understand how to pass variables between event handlers (i.e., different JScript functions registered as event handlers).

Let’s start by first understanding the form event pipeline.

Form event pipeline:

  • We can define up to 50 event handlers for each event. Each event handler is executed in the order that it is displayed in the Event Handlers section in the Events tab of the Form Properties dialog box.

setSharedVariable:

  • Sets the value of a variable to be used by a handler after the current handler completes.
  • Syntax: ExecutionContextObj.setSharedVariable(key, value);
    • Ex: ExecutionContextObj.setSharedVariable("sharedAccountName", formContext.getAttribute("name").getValue());

getSharedVariable:

  • Retrieves a variable set using the setSharedVariable method (a complete sketch follows after this list).
  • Syntax: var sharedVariable = ExecutionContextObj.getSharedVariable(key);
    • Ex: Xrm.Navigation.openAlertDialog({ text: ExecutionContextObj.getSharedVariable("sharedAccountName") });
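
Below is a minimal sketch putting the two methods together, assuming two hypothetical handlers, setAccountName and showAccountName, registered in that order on the same form event (for example, OnLoad of the Account form) with ‘Pass execution context as first parameter’ enabled for both.

// First handler: stores the account name as a shared variable.
function setAccountName(executionContext) {
    var formContext = executionContext.getFormContext();
    executionContext.setSharedVariable("sharedAccountName", formContext.getAttribute("name").getValue());
}

// Second handler: reads the variable set by the previous handler.
function showAccountName(executionContext) {
    var accountName = executionContext.getSharedVariable("sharedAccountName");
    Xrm.Navigation.openAlertDialog({ text: "Shared Account Name - " + accountName });
}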

🙂

Code Snippet | JScript | Capture BPF Stage Parameters

Below is a code snippet to capture the BPF stage parameters upon a ‘Stage’ change using JScript.

function onload(executionContext) {
    var formContext = executionContext.getFormContext();
    // Register event handler on Process Stage Change
    formContext.data.process.addOnStageChange(onStageChange);
}

function onStageChange(executionContext) {
    var formContext = executionContext.getFormContext();
    var stage = formContext.data.process.getSelectedStage();
    // Get Stage Name
    var currStageName = stage.getName();
    // Show Stage Name
    Xrm.Navigation.openAlertDialog({ text: "Current StageName - " + currStageName });

    // Get Stage ID
    var stageID = stage.getId();
    // Get Entity Name
    var stageEntityName = stage.getEntityName();
    // Get Stage Status
    var stageStatus = stage.getStatus();
}
  • Save this script as a web resource and register the ‘onload’ function on the form’s OnLoad event (the ‘onStageChange’ handler is attached at runtime via addOnStageChange).
  • Below is an example of the script registered on the Opportunity form.

🙂

Categories: PowerApps

[Step by Step] Dataverse | Connect Cloud flow with Service Principal (Application User)

By default, a cloud flow’s Dataverse connections run under the owner’s context (i.e., the user who created the flow). When flows move to a different environment via solutions, the connections run under the user account that imported the solution.

Running flows under interactive user accounts is not recommended, as it causes confusion when we check a record’s audit history for who updated the record. It is recommended to make the flow run under an ‘Application User’ if the calling user can be a fixed account.

In this article, let’s see how to make the flow run under an Application User by connecting the flow using the ‘Service Principal’ option.

High level design:

Following are the steps we are going to go through.

  • App registration in Azure Active Directory (AAD)
  • Create an Application User in Environment.
  • Create a Cloud Flow and connect with Application User.

App registration in Azure Active Directory (AAD)

  • Add a Secret and save the Secret value (it is shown only once).
  • Copy the Application ID and Tenant ID.
  • Refer to this article for the detailed ‘App Registration’ steps.

Create an Application User in Environment

  • Click on ‘New app user’ and select ‘Business Unit’ and ‘Security Role(s)’.
  • Click on ‘Add an app’ and select the App registered in previous section.
  • You should see the ‘Application user’ listed as below.

Create a Cloud Flow and connect with Application User:

  • Connect to the Maker portal and create a new Solution.
  • Click on New -> Cloud flow.
  • Click on ‘Connect with Service Principal’.
  • Provide the details captured in Azure Active Directory ‘App Registration’ section and click ‘Create’.
  • Now you should see the new connection under ‘Connection references’ as below.
  • If you go back to the ‘Solution’, you would see a new entry ‘Connection Reference (preview)’ along with the flow.
  • With the ‘Connection Reference (preview)’, we can conveniently move the flow to a different environment using solution export and import.
  • Let’s proceed and complete the flow, which creates a ‘Contact’ record upon the creation of an ‘Account’.
  • Create an ‘Account’ from the ‘Customer Service Hub’ app.
  • A ‘Contact’ gets created by the flow, and the owner would be the ‘Application User’.

Notes:
  • You can use the ‘Run as’ option to make the Dataverse trigger run under one of the available user contexts.

🙂