StyleCop error – The parameter is incorrect

The other day, I encountered the following error while triggering ‘Run StyleCop‘ from Visual Studio.

StyleCop_Error

Reason & Fix:

  • In my case, I cloned a DevOps branch and ran ‘Run StyleCop’ without building the code.
  • Since there were no executables, StyleCop threw the error ‘The parameter is incorrect‘.
  • Building the solution and running ‘Run StyleCop’ again fixed the issue.

StyleCop_Error_1

🙂

XML External Entity Injection – Fix

We have a Web Service which accepts an ‘XML’ payload from the client and processes it.

Recently, the ‘Web Service’ underwent penetration testing (also called ‘ethical hacking’) and the following vulnerability was identified:

PenTest_1

Reason:

  • Our Web Service processes the XML payload submitted by the client using the following code snippet:

PenTest_2

  • We did not have XML payload validation in place.
  • There is a possibility that a hacker could inject malicious content into the XML and cause a denial of service condition, or gain access to unauthorized information on the server where the ‘Web Service’ is hosted.

Fix:

  • We fixed the issue by setting the ‘XmlResolver’ property to ‘null’.
  • External resources are resolved using the resolver provided via the XmlDocument.XmlResolver property. By setting ‘XmlResolver’ to null, the XML parser will not resolve external entities, which prevents access to external resources (i.e., files on the server where the Web Service is hosted).
  • Changing the ‘XmlDocument’ object instantiation as below, with ‘XmlResolver’ set to null, solved our issue.

var xmlDocument = new XmlDocument { XmlResolver = null };

🙂

Azure DevOps – Unable to ‘Pull’ changes from Visual Studio

I have an Azure DevOps repo cloned in VS 2017, and the ‘Pull’ command was not downloading and merging changes from the remote repo.

The ‘Pull’ command updates the code in your local repo with the changes from other members of your team. To know more about the ‘Pull’ command, refer here.

Following steps worked as workaround for me.

  • Download and install the Git tools from here
  • From the Visual Studio’s ‘Team Explorer’, go to ‘Sync’.

git_5

  • From the ‘Actions’ menu, select ‘Open Command Prompt’, which opens a ‘Command Prompt’ where we can run ‘git’ commands.

git_

  • Run the command ‘git pull origin master‘ to fetch and merge the changes of your team members from the default branch ‘master’. Change the branch name if needed.
  • The system would fetch all the code commits by your team members and merge them into your local repo.

git_3

  • Run the command again to make sure all the commits were merged into your local repo. You would get an ‘Already up to date.’ message as below.

git_4

🙂


[Step by Step] Using TypeScript in Dynamics 365 9.x

What is TypeScript:

  • TypeScript extends JavaScript by adding types to the language. TypeScript sits as a layer on top of JavaScript; this layer is the TypeScript type system.
  • In layman’s terms, TypeScript allows a C#-style way of writing JavaScript.

In this article, let’s see how to leverage TypeScript capabilities while writing client scripts in Dynamics 365.

Key Notes:

  • TypeScript’s file extension is ‘.ts‘.
  • Dynamics 365 cannot understand TypeScript; it only understands JavaScript (i.e., .js).
  • So we write the script in TypeScript and use the ‘Transpile’ technique to convert TypeScript to JavaScript.
    • Transpiling is taking source code written in one language and transforming it into another language. In our case, it is from .ts to .js (see the short example after this list).
    • Don’t worry about transpiling, as Visual Studio will take care of it.
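
To make transpiling concrete, here is a tiny, hypothetical example (the file and function names are purely illustrative). The typed TypeScript at the top is what we write; the commented block below shows roughly what the compiler emits for an ES5 target.

// greet.ts - what we write (TypeScript, with type annotations)
function greet(name: string, times: number): string {
    let message: string = "";
    for (let i = 0; i < times; i++) {
        message += "Hello " + name + "! ";
    }
    return message;
}

// After transpiling to ES5, the emitted greet.js looks roughly like:
// function greet(name, times) {
//     var message = "";
//     for (var i = 0; i < times; i++) {
//         message += "Hello " + name + "! ";
//     }
//     return message;
// }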

Now that we know the basics of TypeScript, let’s get started.

Setting up project in Visual Studio:

  • Create a new Web Site project in Visual Studio.

TS_4

  • As we are going to transpile TypeScript files to JavaScript files, create new folders ‘ts’ and ‘js’ for code manageability.
    • The ‘ts’ folder is where we write the TypeScript; the ‘js’ folder is where the transpiled JavaScript resides.

TS_5

  • Next, to transpile TypeScript to JavaScript, add a new JSON file named ‘tsconfig.json‘.

TS_6

  • Paste following statements and save.

{
  "compileOnSave": true,
  "compilerOptions": {
    "outDir": "js",
    "target": "es5",
    "sourceMap": true
  },
  "include": [
    "./ts/**/*"
  ]
}

  • Your ‘tsconfig.json’ should look like below.

TS_7

  • If you observe the ‘tsconfig.json’ settings, they state that whenever you save a ‘.ts’ file it should be compiled (i.e., ‘compileOnSave’) and the transpiled ‘.js’ file saved under the ‘js’ folder (i.e., ‘outDir’).
  • Next, let’s install the ‘xrm’ npm package to get IntelliSense in your project. Refer to this article for more details.

TS_8

  • Run the ‘npm install --save @types/xrm‘ command.

TS_9

  • You would see a new ‘node_modules’ folder with an ‘index.d.ts’ file.
    • If you are wondering what ‘index.d.ts’ is, ‘*.d.ts’ files are used to provide TypeScript ‘Type’ information (a small illustrative example follows below).

TS_10
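
To give a flavour of what a ‘*.d.ts’ file contains, below is a tiny, made-up example (the names are illustrative and are not taken from the actual ‘@types/xrm’ package). A declaration file carries only type information, no executable code:

// sample.d.ts - type declarations only, no implementation
declare namespace SampleLib {
    interface Person {
        firstName: string;
        lastName: string;
    }
    function fullName(person: Person): string;
}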

Write your first TypeScript file:

Now that we have the project set up, let’s write a simple ‘TypeScript’ file with an ‘onload’ function for the ‘Contact’ entity.

  • Add a new file ‘contact.ts’ under the ‘ts’ folder.
  • Let’s add the following basic structure, containing a namespace and an ‘onload’ function.
    • A namespace is used for the logical grouping of functionality. It allows us to organize our code in a much cleaner way.

TS_13

  • Now, let’s extend the onload function by reading the Contact Name from the ‘name’ field and showing it in a pop-up.

TS_14

  • Let’s extend it further by registering an ‘onchange’ event on the ‘Contact Name’ field.

TS_15

  • Save the ‘contact.ts’ file and you will get the transpiled ‘contact.js’ JavaScript file under the ‘js’ folder.

TS_16

  • Following is the TypeScript code if you want to use it.

namespace Pract {
    export function onload(executionContext: Xrm.Events.EventContext) {
        let formContext = executionContext.getFormContext();

        // Define 'String Attribute' variable to read the Contact Name attribute
        let attrContactName: Xrm.Attributes.StringAttribute;
        attrContactName = formContext.getAttribute<Xrm.Attributes.StringAttribute>("firstname");

        // Register the onchange event (reference the namespace function directly; 'this' is not valid here)
        attrContactName.addOnChange(contactName_onchange);

        // Read the attribute value
        let contactName = attrContactName.getValue();

        var alertStrings = { confirmButtonLabel: "Yes", text: contactName, title: "Onload Event" };
        var alertOptions = { height: 120, width: 260 };
        Xrm.Navigation.openAlertDialog(alertStrings, alertOptions).then(
            function success(result) {
                console.log("Alert dialog closed");
            },
            function (error) {
                console.log(error.message);
            }
        );
    }

    function contactName_onchange(executionContext: Xrm.Events.EventContext) {
        alert("OnChange triggered!");
    }
}

Register and test the script:

  • Add the transpiled ‘contact.js’ file as a web resource to the Dynamics application.
  • On the ‘Contact’ form, register the ‘onload’ event. The convention is ‘namespace.functionname’ (i.e., Pract.onload).

TS_1

  • Open a ‘Contact’ record and you will get the ‘Name’ value as a pop-up.

TS_3

🙂

 


Power Apps component framework (PCF) – Beginner guide

March 21, 2020

Of late, I have been receiving a lot of requests to explain the nitty-gritty of the Power Apps component framework, aka PCF.

While tons of great articles and videos on PCF are already available, I decided to provide a comprehensive PCF beginner guide along with steps to reuse the pre-built controls.

Topics to be covered in this article:

  • What is PCF?
  • How did we survive the pre-PCF era?
  • Pre-requisites
  • Understand the file structure of PCF control
  • Get familiar with commands
  • Build a simple PCF control.
  • Run and Debug in Browser
  • Packaging PCF control to a Solution
  • How to download and use the sample PCF controls
  • Add components to Model Driven App

Let’s get started and dive into the details.

What is PCF?

  • With PCF, we can provide an enhanced user experience for users working with data on forms, views, and dashboards.
  • For example, a basic ‘Whole Number’ field can be turned into a slider control where the user can slide to set the value rather than typing it.
  • A couple of important points to consider:
    • PCF works only on Unified Interface and not on the web client.
    • PCF doesn’t work for on-premises instances.

How did we survive the pre-PCF era?

  • As mentioned in the previous section, with PCF we can achieve an enhanced user experience.
  • Before PCF, we could enhance the user experience with the help of HTML web resources. As an example, we could build a ‘Slider’ control in an HTML page with CSS and add it as a web resource to the form.
  • How is PCF different from web resources?
    • Unlike HTML web resources, PCF code components are rendered as part of the same context and load at the same time as other components, providing a seamless experience for users.

Pre-requisites:

Now let’s see the prerequisites to build our first PCF control.

  • Microsoft Power Apps CLI
    • Power Apps CLI enables developers to create code components quickly.
    • Download from here.

PCF_1

  • Get TypeScript
    • PCF components use the TypeScript language. We need to get TypeScript before we get into development.
    • Getting TypeScript is a 2-step process.

Understand the file structure of PCF control:

Before we build our first PCF control, let’s understand the structure of a PCF component.

  • PCF components consist of three elements:
    • Manifest
      • Manifest is the XML metadata (i.e., Name, Properties, etc..) file that defines a PCF control.
    • Component implementation
      • Implementation is nothing but building your control using TypeScript.
      • Each code component must have an index.ts file.
      • The index.ts file contains the following methods (a minimal skeleton is shown after this list):
        • init (Required) – When the Dynamics form is ready, the platform initializes the PCF component by calling the init method.
        • updateView (Required) – If the data changes, the platform calls the updateView method.
        • getOutputs (Optional)
        • destroy (Required)
    • Resources
      • If the PCF control requires styling (i.e., CSS), it has to be added as a resource.
      • The resource file has to be defined in the Manifest.
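
As a rough picture of how these pieces fit together, below is a minimal, hand-written sketch of an ‘index.ts’ skeleton (the class name and import path are illustrative; the file generated by ‘pac pcf init’ contains additional boilerplate and generated types):

import { IInputs, IOutputs } from "./generated/ManifestTypes";

export class HelloWorld implements ComponentFramework.StandardControl<IInputs, IOutputs> {

    // init (Required): called once when the control is initialized
    public init(context: ComponentFramework.Context<IInputs>,
                notifyOutputChanged: () => void,
                state: ComponentFramework.Dictionary,
                container: HTMLDivElement): void {
    }

    // updateView (Required): called by the platform whenever any bound data changes
    public updateView(context: ComponentFramework.Context<IInputs>): void {
    }

    // getOutputs (Optional): returns values from the control back to the framework
    public getOutputs(): IOutputs {
        return {};
    }

    // destroy (Required): clean up resources when the control is removed
    public destroy(): void {
    }
}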

Get familiar with commands:

We have to be familiar with the Power Apps CLI commands before building our first control. All these commands need to be triggered from the Visual Studio Command Prompt.

  • Initialization
    • This is the first command; it creates the basic folder structure of the PCF control project.
    • Syntax:
      • pac pcf init --namespace <specify your namespace here> --name <Name of the code component> --template <component type>

        • --namespace: Unique namespace of your control.
        • --name: Name of the control. Below is the command to create a control named ‘HelloWorld’.
        • --template: Component type. Model-driven apps support two types, field and dataset; canvas apps support only the field type.
      • Ex: pac pcf init --namespace RajeevPCF --name HelloWorld --template field
  • Install Dependencies
    • Once ‘init’ sets up the basic folder, as a next step, install all the PCF control dependencies using the ‘npm install’ command.
    • Syntax:
      • npm install

    • Note: If you are wondering what npm is, it stands for ‘Node Package Manager’. As we already installed ‘Node.js’ as part of the prerequisites, we are able to use it now.
  • Build PCF Component
    • Once you implement the PCF component, build the code to check for any syntax errors.
    • Syntax:
      • npm run build

  • Solution – Init
    • Once we are done with the PCF component code implementation, we need to package it into a Solution which can be imported into your CDS instance.
    • ‘Init’ creates the basic solution folder structure.
    • Syntax:
      • pac solution init --publisher-name <enter your publisher name> --publisher-prefix <enter your publisher prefix>

        • --publisher-name: Name of the solution publisher.
        • --publisher-prefix: Solution prefix.
      • Ex: pac solution init --publisher-name SamplePCF --publisher-prefix samppcf
  • Solution – Add-Reference
    • Solution – Init, creates a basic solution folder structure.
    • To link the PCF component to the Solution, trigger the ‘Add-Reference’ command.
    • Syntax:
      • pac solution add-reference --path <path to your Power Apps component framework project>

        • --path: Path of the PCF component project folder.
      • Ex: pac solution add-reference --path "D:\E\Practice\PCF\Controls\HelloWorld"
  • Solution – Package
    • This is the final solution packaging step, which generates the zip file.
    • Syntax:
      • msbuild /t:build /restore
  • Run the PCF control in browser
    • Use the ‘npm start’ command to open the PCF control in the browser.
    • This helps us debug and test before we import the PCF control into CDS.
    • Syntax:
      • npm start

Build a simple PCF control:

As we have installed the prerequisites and familiarized ourselves with the commands, let’s start building our first PCF control.

  • Create 2 folders (i.e., Controls, Solutions) in your drive.
    • Note: You can have just a single folder, but 2 folders give me flexibility to manage them.
  • Under ‘Controls’ folder, create a sub folder with your PCF control name.
  • Open the ‘Developer Command Prompt for VS 2017/2019’ and point to the ‘Controls -> PCF Control name’ folder. In my case, my PCF control folder name is ‘HelloWorld’.

PCF_2

  • Run the ‘init’ command.

PCF_4

  • As we know, ‘Init’ creates a basic folder structure; if you go to the folder, you would see the following files.

PCF_5

  • Now run the ‘install’ command to add all dependencies.

PCF_6

  • If you go to the folder, you would see a new ‘node_modules’ folder.

PCF_8

  • Now open the project folder in either Visual Studio or Visual Studio Code.
    • If you are using the ‘VS Code’ editor, trigger the ‘code‘ command from the ‘Command Prompt’, which opens the ‘VS Code’ editor automatically.
  • If you glance at the files in the explorer, you would see the ‘Manifest’ file (ControlManifest.Input.xml) with metadata.

PCF_11

  • Also the ‘index.ts’ file, which contains the 4 methods described in the sections above.

PCF_12

    • ‘context’ parameter in ‘index.ts’ file
      • Contains the context of the form, similar to the plugin context in plug-ins.
      • ‘context.parameters’ fetches the PCF control parameters defined in the Manifest file. ‘sampleProperty’ is the default property name of the PCF control parameter defined in the Manifest.
  • I’ve added the following simple logic to both the ‘init’ and ‘updateView’ methods, which reads the field value from ‘context’ and displays it in an HTML DIV control (a minimal sketch follows the screenshot below).

PCF_14
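
In case the screenshot is hard to read, a minimal sketch of that logic looks roughly like the following. These members live inside the control class; I am assuming the default ‘sampleProperty’ parameter name from the manifest, and the exact DIV handling in my control may differ slightly.

// Fields on the control class to hold the container and the DIV we render into
private _container: HTMLDivElement;
private _valueElement: HTMLDivElement;

public init(context: ComponentFramework.Context<IInputs>,
            notifyOutputChanged: () => void,
            state: ComponentFramework.Dictionary,
            container: HTMLDivElement): void {
    this._container = container;

    // Create a DIV and show the field value read from the context
    this._valueElement = document.createElement("div");
    this._valueElement.innerText = String(context.parameters.sampleProperty.raw);
    this._container.appendChild(this._valueElement);
}

public updateView(context: ComponentFramework.Context<IInputs>): void {
    // Refresh the displayed value whenever the bound field changes
    this._valueElement.innerText = String(context.parameters.sampleProperty.raw);
}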

  • Once the implementation is complete, build the project by triggering the ‘npm run build‘ command.

PCF_13

Run and Debug in Browser:

  • If there are no errors during the build, use the ‘npm start‘ command to open the PCF control in the browser.
  • This is a handy option which allows us to debug locally in the browser before porting to CDS.

PCF_9

  • ‘npm start’ opens the PCF control in a new browser tab as below.

PCF_10

  • Use ‘F12’ to debug the functions (i.e., init, updateView, etc.).

Packaging PCF control to a Solution:

Now that we have built and tested the PCF control, let’s package it into a solution, which enables us to transport the PCF control to CDS where it can be used further on forms and dashboards.

  • In the ‘Solutions’ folder created in the above steps, create another sub folder for the PCF control. In my case, I named the folder ‘PCF_HelloWorld’.
  • Open the ‘Developer Command Prompt for VS 2017/2019’ and point to the ‘Solutions -> PCF_HelloWorld’ folder.
  • Trigger the ‘solution init‘ command, which creates the folder structure.

PCF_15

  • Trigger the ‘solution add-reference‘ command, which links our PCF control with the Solution.

PCF_17

  • Finally, trigger the ‘msbuild‘ command to zip the components.

PCF_18

  • ‘msbuild‘ creates the ‘bin/Debug’ folder along with the solution zip file.

PCF_19

  • Once we have the solution zip file, we are good to use the PCF control by importing it into CDS. This is explained in the coming sections.

How to download and use the sample PCF controls:

As we have learnt how to build our own PCF control, let’s see how to use the pre-built PCF controls. I am going to leverage the increment-control from the sample components.

  • Open the link and scroll to the end. Click on the ‘Download sample components’ link, which prompts you to download a zip folder containing all the sample controls.

PCF_24

  • Save the zip to your drive, extract it, and copy it to the ‘Controls’ folder which we created in previous sections. It looks something like below.

PCF_40

 

  • As we are going to use the ‘TS_IncrementComponent’ control, we need to run the npm commands to build and package it into a solution (i.e., a zip file).
  • Open the ‘Developer Command Prompt for VS 2017/2019’ and point to the ‘Controls -> TS_IncrementComponent’ folder.
  • Note: We don’t need to run the ‘Init’ command as the folder structure is already formed from the download.
  • Run the ‘npm install‘ command to install all the required dependencies.

PCF_27

  • Note: We don’t need to run the ‘build’ command as the control was already built, so we can directly start solution packaging.
  • Create a new sub folder (i.e., ‘PCF_Increment’ is my folder name) under the ‘Solutions’ folder and point the ‘Developer Command Prompt for VS 2017/2019’ to this new folder.
  • Run the ‘solution init’ command to start packaging.

PCF_28

  • Run the ‘solution add-reference’ command to link the PCF control to the Solution.

PCF_29

  • Finally, run the ‘msbuild’ command to create the solution zip file.

PCF_30

  • Go to the folder and you should see a new ‘bin/Debug’ folder as below, with the solution zip file.

PCF_31

  • As we have the solution zip file, let’s see how to use the PCF control on our Dynamics forms.

Add components to Model Driven App

In this section, let’s see how to transport the PCF control solution zip file created in the above section to CDS.

PCF_31

Prerequisites:

  • Subscribe to the Dynamics 30-day trial and get an Office 365 account.
  • Connect to the Power Apps maker portal using the Office 365 account.

Add PCF component solution to CDS:

  • To import the solution into CDS, first connect to the Power Apps maker portal.
  • Go to the ‘Solutions’ tab.
  • Click on ‘Import’, browse to the solution zip file (i.e., PCF_Increment.zip), and complete the import process.

PCF_33

  • Publish customizations.

Add PCF control to Entity:

As we have imported the ‘TS_IncrementComponent’ control, which works with a ‘Whole Number’ field, create a ‘Whole Number’ field and configure the PCF control.

  • Create a new ‘Whole Number’ field in one of the entities.

PCF_35

  • Open the ‘Main’ customization form in the legacy editor.
    • Note: At the time of writing this article, PCF controls can be configured only from the legacy UI.
  • Click on the field, go to the ‘Controls’ tab and click on ‘Add Control…’.

PCF_36

  • Select the ‘IncrementControl’ from the list.

PCF_37

  • Save and publish the form.
  • Open the form in the Model Driven App and you should see the PCF control in place of the ‘Whole Number’ field as below.

PCF_38

🙂

D365 Data Export Service – Unable to connect to Azure SQL Server from Profile

March 14, 2020

The other day, while configuring the Data Export Service to replicate CDS data to an Azure SQL Server database, I encountered the following error at the ‘Profile’ creation step:

Unable to connect to the Destination mentioned in the KeyVault URL

DES_22

Troubleshooting steps and Fix:

From the error message it is clear that the Data Export Service is unable to connect to the destination, which is the Azure SQL Server DB in my case. The following checks helped me resolve the issue.

  • Make sure you copy the Azure SQL DB connection string using the ‘Copy’ option. This is super important when you are creating the ‘Key vault URL’.

DES_12

  • After the Azure Key Vault generation, validate the ‘Secret value’ from the Azure portal.

DES_28

  • Test the Azure SQL DB connection from either SSMS or .udl file using the same ‘User ID’ and ‘Password’ specified while generating the Azure Key Vault URL.

DES_27

  • Make sure the following ‘Firewall settings’ are made on your Azure SQL Server.
    • Open the ‘SQL Server’ from the Azure Portal and click on ‘Show firewall settings’.
    • DES_26
    • Make the settings as below.
    • DES_25
  • Now try validating the Data Export Service’s ‘Key Vault URL’ from the ‘Profile’ and it should work.

DES_24

Refer to my ‘Step by step configuring Data Export Service’ article on the usage of the Data Export Service.

🙂

[Step by Step] Data Export Service – Replicate CDS data to Azure SQL Server Database

March 14, 2020

For the unversed, the Data Export Service (DES) is an add-on service available on AppSource that adds the ability to replicate data to target destinations such as Azure SQL Database and SQL Server on Azure virtual machines.

In this article, let’s see how to replicate CDS data to an Azure SQL Server database.

Prerequisites:

  • Subscribe to the Dynamics 30-day trial and get an Office 365 account.
  • Connect to the Power Apps maker portal using the Office 365 account and create a simple Model Driven App with the ‘Account’ entity.
  • Install the ‘Data Export Service’ add-on from AppSource.
  • Subscribe to the Azure 30-day trial using the same Office 365 account.

DES_5

  • Azure SQL Server and Database.
  • Generate an Azure Key Vault URL using PowerShell.

Installing ‘Data Export Service’ from AppSource:

  • From your Model Driven App, go to Settings -> Microsoft AppSource
  • Search for ‘Data Export Service’ and click on ‘Get it now’.

DES_1

  • In the next steps, select your Organization and click on ‘Agree’.

DES_3

  • Wait for the solution to get installed.

DES_4

Configuring Azure SQL Server and Database:

  • Connect to your Azure Portal and create a new SQL Database Server resource.

DES_6

  • After creation, copy the ‘Server name’, ‘Server admin’ and ‘Subscription ID’, which you will need in the next steps.

DES_7

  • Now we need to create a new database to which you will replicate your CDS data.
  • You can create the new database either from the Azure Portal or from SQL Server Management Studio (SSMS). I am using SSMS to create the new database.
  • Connect to the Azure SQL Server from SSMS using the ‘Server name’ and ‘Server admin’ details captured in the previous steps.

DES_9

  • Create a new Database and copy the name (i.e., DES in my case).

DES_10

  • Now go to Azure Portal and open the DES SQL database. Click on ‘Connection Strings’.

DES_11

  • Copy the ‘ADO.NET’ connection string, which you will need in the next steps.

DES_12

Generate Azure Key Vault URL using PowerShell and create Export Profile:

  • To connect to Azure SQL Server, the Data Export Service (DES) requires the SQL Server connection string.
  • DES will not accept a plain connection string.
  • DES accepts only an Azure Key Vault URL with the connection string stored as a secret.

As a standard practice, we configure the Azure Key Vault URL using a PowerShell script with the following steps:

  • From the Model Driven App, go to Settings -> Data Export
    • Note: Make sure you disable pop-up blockers.
  • Click on ‘+New’ to create ‘Export Profile’.

DES_13

  • From the pop-up, fill in the details other than ‘Key Vault URL’ and click on the information icon.

DES_14

  • Copy the PowerShell helper script from the pop-up.

DES_15

  • Now open ‘Windows PowerShell’ in a new window.
  • Paste the PowerShell helper script copied from previous step.
  • In the ‘PLACEHOLDER’ section provide the details specific to your Azure subscription.
    • $subscriptionId – Use the copied value from previous steps.
    • $keyvaultName – Your desired name.
    • $secretName – Your desired name.
    • $resourceGroupName – Either use an existing one or a new name. If the resource group doesn’t already exist, a new one will be created.
    • $location – Provide the location (i.e., East US/West US/…).
    • $connectionString – Use the copied value from previous steps.
    • $organizationIdList – Dynamics organization ID. From the Model Driven App, go to Settings > Customizations > Developer Resources. The organization Id is under ‘environment Reference Information’.
    • $tenantId – You can get it from ‘Azure Portal -> Azure Active Directory -> App registrations -> Endpoints‘ by copying the highlighted value as below.
    • DES_16
  • Once all the required details are provided, your PowerShell script looks as below.

DES_17

  • Run the script which creates a new Azure Key Vault. You should get output as below.

DES_18

  • Now that the ‘Key Vault URL’ is formed, copy the value from ‘Azure Portal -> Key vaults -> Secrets’.

DES_19

  • Go back to the DES ‘Data Export Profile’ pop up and paste the ‘Key Vault URL’.
  • Click on ‘Validate’ to make sure the connection works.

DES_24

  • Click ‘Next’ and pick the CDS entities which you want to replicate to Azure SQL.

DES_29

  • As a last step, click on ‘Create & Activate’ to complete the setup.

DES_30

  • Give it some time and you should see the metadata tables and ‘Account’ entity tables created in your target (i.e., Azure SQL Server).

DES_31

Notes:

  • Data Export Service can be used with model-driven apps in Dynamics 365, such as Dynamics 365 Sales and Customer Service.
  • Only entities that have change tracking enabled can be added to the Export Profile. Enable change tracking to control data synchronization.
  • To use the Data Export Service, the model-driven apps in Dynamics 365 and the Azure Key Vault service must operate under the same tenant and within the same Azure Active Directory.
  • The Data Export Service does not drop (delete) the associated tables, columns, or stored procedure objects in the destination Azure SQL database when the following actions occur and these items must be dropped manually.
    • An entity is deleted.
    • A field is deleted.
    • An entity is removed from an Export Profile.

Refer to this article to know more about the Data Export Service. To troubleshoot connectivity issues, refer to this article.

🙂

Canvas App - Working with Bing Maps connector

In this article, let’s understand the basics of the ‘Bing Maps‘ connector to locate addresses in a Canvas App.

Prerequisites:

  • Subscribe to the Dynamics 30-day trial and get an Office 365 account.
  • Connect to the Power Apps maker portal using the Office 365 account to build the Canvas App.
  • Bing Map API Key

How to get Bing Map API Key:

  • Connect to Bing Maps Portal
  • ‘Sign In’ using either ‘Microsoft Account’ or ‘Enterprise Azure Active Directory account’.

bm_1

  • Go to ‘My account -> My Keys’ and create a new key. (Steps to create new key)
  • Copy the key, which we are going to use in the next steps.

bm_2

Adding ‘Bing Maps’ connector to Canvas App:

  • Create a new Canvas App.
  • Add a ‘Bing Maps’ connector.
  • Provide the ‘API Key’ captured previously.

bm_3

Locate address using ‘Bing Maps‘ connector:

  • For better understanding, I added the below controls to my Canvas App’s screen.
    • 3 text box controls to capture the address
    • An Image control which loads the map
    • A Button to load the map.
  • On the Button’s ‘OnSelect’, declare a global variable ‘varAddress‘ and use the BingMaps.GetLocationByAddress API to convert the address to coordinates (i.e., latitude and longitude).
    • Set(
      varAddress,
      BingMaps.GetLocationByAddress(
      {
      addressLine: txtAddress1.Text,
      locality: txtAddress2.Text,
      postalCode: txtZip.Text
      }
      )
      );

bm_4

  • Now use the coordinates from the ‘varAddress‘ variable and load the map in the Image control.
  • Set the ‘Image’ property to BingMaps.GetMap(), passing the coordinates from ‘varAddress’.
    • BingMaps.GetMap(
      "AerialWithLabels",
      15,
      varAddress.point.coordinates.latitude,
      varAddress.point.coordinates.longitude)

bm_5

  • Run the App and you should see the address located on the map.

bm_6

  • You can also show a ‘Push Pin’ using the ‘pushpin’ parameter. You need to pass the coordinates as below.
    • BingMaps.GetMap(
      "AerialWithLabels",
      15,
      varAddress.point.coordinates.latitude,
      varAddress.point.coordinates.longitude,
      {pushpin: varAddress.point.coordinates.latitude & "," & varAddress.point.coordinates.longitude}
      )

bm_7

🙂


PL-900: Microsoft Power Platform Fundamentals – Prep Notes

February 25, 2020

There was an ask from a Power Platform beginner on my blog about preparation for the PL-900 exam.

As I attained the certification recently, I am going to share the topics I covered for my preparation in this article.

***This article is strictly for guidance purposes and by no means do I intend to post questions from the exam.***

Areas I covered during my preparation:

  • Power Platform Environments
    • Types of Environments
    • Use of Default Environment
    • Built-in Roles – Admin Role, Maker Role
  • Canvas Apps vs Model Driven Apps
    • Understand that ‘Model Driven Apps’ can only consume CDS data.
    • You need Canvas app for other types of data.
  • Data Connectors
    • Tabular data and Functional-based data connectors
  • Power Automate Flow
    • How you post CDS data to SAP/Salesforce/etc.
    • Integrate Flow with Canvas and Model Driven Apps.
    • How to test the Flow.
  • Power Apps Portals
    • Understand the basics of Web Page, Web Role, Web Templates.
  • AI Builder
    • Understand what you can achieve with the 4 models.

PL900_1

  • Power BI
    • Power BI Desktop vs Power BI Service: what can and can’t be done with each.
    • Data Modeling.
    • How to create and share Dashboards.

If you have any further questions or need guidance, feel free to leave a comment.

🙂

 


ADX Portal – Forgot Password – ‘Invalid Party Object Type’ Error

February 21, 2020

A few of our ADX portal users encountered an unhandled exception while resetting the password using the native ‘Forgot Password’ feature.

ADX_FgtPwd_1

Reason:

  • It is a sporadic issue affecting only a few portal users.
  • When we checked the ‘Event Viewer’ logs on the portal’s web server, there was the following warning with an ‘Invalid Party Object Type 9’ exception message.

ADX_FgtPwd_2

  • In CRM, ‘Object Type 9’ denotes the OOB ‘Team’ entity, and the issue turned out to be with the ‘From’ field of the ‘Send Password Reset To Contact‘ ADX process.
  • In our requirement, we were setting the ‘From’ field of the ‘Send Password Reset To Contact’ process to the owner of the ‘Portal User’ (i.e., Contact).
  • Portal users (i.e., Contacts) who are owned by teams encounter this issue while triggering ‘Forgot Password’, as the email can’t be delivered when a ‘Team’ is set as ‘From’.

Fix:

  • Modified the ‘Send Password Reset To Contact‘ process by setting the ‘From’ field to a ‘System User’ with a mailbox enabled. This made sure no ‘Team’ renders in the ‘From’ field.

ADX_FgtPwd_3

🙂