Archive for the ‘CRM’ Category

Power Platform Admin Analytics

September 27, 2020 Leave a comment

In previous Dynamics versions, with the help of Organization Insights, available as a preferred solution on AppSource, we could get organization-level insights such as the number of active users, API calls, etc.

Now these analytics can be viewed right from the Power Platform Admin Center, with no need for additional solutions.

Currently, three types of analytics are available.

Who can view the Analytics reports:

Admins with the following roles and a license can view the analytics:

  • Environment Admin – can view reports for the environments that the admin has access to.
  • Power Platform admin – can view reports for all environments.
  • Dynamics 365 admin – can view reports for all environments.
  • Microsoft 365 Global admin – can view reports for all environments.

Refer to this article for more details.

🙂

Categories: CRM, PowerApps

[Code Snippet] Using Microsoft.Pfe.Xrm library

September 20, 2020 Leave a comment

The Microsoft.Pfe.Xrm library allows us to submit batches of XRM requests (using ExecuteMultipleRequest) and execute them in parallel.

In this article, let's see how to use the Microsoft.Pfe.Xrm library with a simple ‘Retrieve Request’.

What is Microsoft.Pfe.Xrm library?

  • Contains a set of common components for building solutions using the Dynamics 365 SDK and is developed by Microsoft Premier Field Engineering (PFE) and core engineering teams.

Download Microsoft.Pfe.Xrm

  • Use ‘Tools -> NuGet Package Manager’ in Visual Studio to download the NuGet package.

Building blocks of Microsoft.Pfe.Xrm library

The ‘Microsoft.Pfe.Xrm’ library allows us to trigger any ‘OrganizationRequest’ in batches (i.e., as ExecuteMultipleRequest) in parallel. Let's understand the building blocks of the Pfe library.

  • List<OrganizationRequest>
    • As a first step, prepare your ‘RetrieveMultipleRequest’ objects and add them to a List.
  • AsBatches() method
    • Call the Pfe library's AsBatches() method, passing the List prepared in the previous step.
    • ‘AsBatches()’ returns an IDictionary<string, ExecuteMultipleRequest> object.
  • OrganizationServiceManager
    • Instantiate ‘Microsoft.Pfe.Xrm’ OrganizationServiceManager to create the connection object.
    • There are multiple ‘OrganizationServiceManager’ constructor overloads.
    • In this article, I am using the following method.
  • ParallelProxy.Execute
    • This is the final method; it accepts the ‘IDictionary<string, ExecuteMultipleRequest>’ object returned by the AsBatches() method discussed above.
    • ParallelProxy.Execute also requires the OrganizationServiceManager created in the previous step.

IDictionary<string, ExecuteMultipleResponse> batchResponses = pfeXrmConnection.ParallelProxy.Execute(batchesOrgRequest);
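Conceptually, AsBatches() just chunks the flat request list into fixed-size groups, each keyed by a unique batch id. A minimal stand-alone sketch of that idea (plain strings stand in for OrganizationRequest objects; this is not the library's actual implementation):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class BatchingSketch
{
    // Chunk a flat list of "requests" into batches of up to batchSize items,
    // each batch keyed by a unique id - mirroring the shape of
    // IDictionary<string, ExecuteMultipleRequest> returned by AsBatches().
    public static IDictionary<string, List<string>> AsBatchesSketch(List<string> requests, int batchSize)
    {
        var batches = new Dictionary<string, List<string>>();
        for (int i = 0; i < requests.Count; i += batchSize)
        {
            batches[Guid.NewGuid().ToString()] = requests.Skip(i).Take(batchSize).ToList();
        }
        return batches;
    }

    static void Main()
    {
        var requests = Enumerable.Range(1, 10).Select(i => $"request{i}").ToList();
        var batches = AsBatchesSketch(requests, 4);
        // 10 requests with a batch size of 4 -> 3 batches (4 + 4 + 2)
        Console.WriteLine(batches.Count); // prints 3
    }
}
```

ParallelProxy.Execute then fans those batches out across parallel service channels, which is where the throughput gain comes from.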

Code Snippet

Now let's put all the building blocks together; it becomes the following code snippet.

    public void RetrieveUsingPFE()
    {
        // Retrieve 'Account' and 'Contact'. You can prepare your list of entities.
        var listEntities = new List<string> { "account", "contact" };
        var batchRequests = new List<OrganizationRequest>();            

        int pageNumber = 1;
        string pagingCookie = null;

        // Filter for Active records. You can prepare your required filters as below.
        var filterActive = new FilterExpression(LogicalOperator.And);
        var condActive = new ConditionExpression("statecode", ConditionOperator.Equal, 0);
        filterActive.AddCondition(condActive);

        try
        {
            foreach (var entity in listEntities)
            {
                // Prepare 'Query Expression'
                var query = new QueryExpression(entity)
                {
                    ColumnSet = new ColumnSet(true),
                    PageInfo = new PagingInfo()
                };
                query.PageInfo.Count = 5000;
                query.PageInfo.PageNumber = pageNumber;
                query.PageInfo.PagingCookie = pagingCookie;

                query.Criteria.AddFilter(filterActive);

                var request = new RetrieveMultipleRequest
                {
                    Query = query
                };

                batchRequests.Add(request);
            }

            do
            {
                // Trigger Pfe.Execute
                var batchResponses = TriggerRequests(batchRequests);
                batchRequests = new List<OrganizationRequest>();
                pageNumber++;
                foreach (var responseItem in batchResponses)
                {
                    if (responseItem.EntityCollection.Entities.Count > 0)
                    {                            
                        var currEntityName = responseItem.EntityCollection.Entities[0].LogicalName;

                        foreach (var record in responseItem.EntityCollection.Entities)
                        {
                            // 'record' is the retrieved record
                            // Perform your desired operations
                        }
                    }

                    if (responseItem.EntityCollection.MoreRecords)
                    {
                        var query = new QueryExpression(responseItem.EntityCollection.EntityName)
                        {
                            ColumnSet = new ColumnSet(true)
                        };

                        query.Criteria.AddFilter(filterActive);

                        var request = new RetrieveMultipleRequest();
                        query.PageInfo = new PagingInfo
                        {
                            Count = 5000,
                            PageNumber = pageNumber,
                            PagingCookie = responseItem.EntityCollection.PagingCookie
                        };

                        request.Query = query;
                        batchRequests.Add(request);
                    }
                }
            } while (batchRequests.Count > 0);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error occurred, error message is {0}", ex.Message);
            throw;
        }            
    }

    public List<RetrieveMultipleResponse> TriggerRequests(List<OrganizationRequest> batchRequests)
    {
        // 'recordsPerRequest' is the batch size (e.g., a class-level field).
        var retrieveRequestBatches = batchRequests.AsBatches(recordsPerRequest);
        var batchResponses = ExecuteParallelProxy(retrieveRequestBatches);
        var listResponses = new List<RetrieveMultipleResponse>();

        foreach (var key in batchResponses.Keys)
        {
            foreach (var result in batchResponses[key].Responses)
            {
                if (retrieveRequestBatches[key].Requests[result.RequestIndex] is RetrieveMultipleRequest)
                {
                    var originalRequest = (RetrieveMultipleRequest)retrieveRequestBatches[key].Requests[result.RequestIndex];
                    // Capture failed records
                    if (result.Fault != null)
                    {
                        Console.WriteLine($" Exception : {result.Fault.Message}");
                    }
                    else if (result.Response != null && result.Response is RetrieveMultipleResponse) // Capture success records
                    {
                        var responseItem = (RetrieveMultipleResponse)result.Response;
                        listResponses.Add(responseItem);
                    }
                }                    
            }
        }

        return listResponses;
    }

    private IDictionary<string, ExecuteMultipleResponse> ExecuteParallelProxy(IDictionary<string, ExecuteMultipleRequest> retrieveRequestBatches)
    {            
        IDictionary<string, ExecuteMultipleResponse> batchResponses = null;
        try
        {
            batchResponses = CRMPfeConnection.ParallelProxy.Execute<ExecuteMultipleRequest, ExecuteMultipleResponse>(retrieveRequestBatches);                
        }
        catch (AggregateException ae)
        {
            foreach (var ex in ae.InnerExceptions)
            {
                Console.WriteLine($"Aggregate Error in ExecuteParallelProxy : {ex.Message.ToString()}");
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Error in ExecuteParallelProxy : {ex.Message}");
        }

        return batchResponses;
    }

    public static OrganizationServiceManager CRMPfeConnection
    {
        get
        {
            if (crmPfeConnection == null)
            {
                var UserName = ConfigurationManager.AppSettings["UserNameSrc"].ToString();
                var Password = ConfigurationManager.AppSettings["PasswordSrc"].ToString();
                var OrganizationUri = ConfigurationManager.AppSettings["OrganizationUriSrc"].ToString();
                ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
                crmPfeConnection = new OrganizationServiceManager(XrmServiceUriFactory.CreateOrganizationServiceUri(OrganizationUri), UserName, Password);
            }
            return crmPfeConnection;
        }
    }
  • Similar to ‘RetrieveMultipleRequest’, you can trigger any kind of request in batches.

🙂

Power Apps | CDS | Track deleted records using ‘Change Tracking’

For the unversed: Change Tracking in CDS allows data to be synchronized by detecting what has changed since the data was initially extracted or last synchronized.

It's useful in integration scenarios where you have to keep data between CDS and external systems in sync.

Assume your customer has an LOB application alongside CDS and you want to sync the LOB application whenever a record gets created, updated, or deleted in CDS; this can be achieved using the ‘Change Tracking’ feature.

In this article, let's understand how ‘Change Tracking’ works in CDS using the ‘Organization Service’, to:

  • Get the Created or Updated records from last sync.
  • Get Deleted records from last sync.

Enabling Change Tracking for an Entity:

  • To use ‘Change Tracking’, make sure the change tracking feature is enabled for that entity.
  • This feature can be enabled by using the customization user interface.
Enable Change Tracking

‘Change Tracking’ using the Organization Service:

  • When change tracking is enabled for an entity, the RetrieveEntityChangesRequest message can be used to retrieve the changes for that entity.
  • In my example, I am using the OOB ‘Account’ entity, which has 10 records.
  • When ‘RetrieveEntityChangesRequest’ is triggered for the first time, we get all 10 records in the ‘Changes’ object, along with a ‘Data Token’.
  • Copy the ‘Data Token’.
  • Now let's delete an Account and add a new Account in CDS.
  • Trigger ‘RetrieveEntityChangesRequest’ again, passing the copied ‘Data Token’.
    • Passing the ‘Data Token’ in ‘RetrieveEntityChangesRequest’ fetches only the changes that occurred since that version.
  • This time we get 2 records in the ‘Changes’ object:
    • 1 NewOrUpdatedItem
    • 1 RemovedOrDeletedItem
  • ‘NewOrUpdatedItem’ is an object of type ‘Entity’ with all the properties; ‘RemovedOrDeletedItem’ is an object of type ‘EntityReference’ containing only the GUID of the deleted record.
  • Following is the code snippet to track changes and display them in the console.
// Created or Updated records will be returned as 'Entity' objects.
var createdOrUpdatedRecords = new List<Entity>();
// Deleted records will be returned as 'EntityReference' objects.
var deletedRecords = new List<EntityReference>();
string token = null;

// Retrieve records by using the Change Tracking feature.
var request = new RetrieveEntityChangesRequest
{
    EntityName = "account",
    Columns = new ColumnSet("name"),
    PageInfo = new PagingInfo
    {
        Count = 5000,
        PageNumber = 1,
        ReturnTotalRecordCount = false
    },
    // DataVersion is an optional attribute.
    // If not passed, the system returns all the records.
    // Don't pass this attribute when calling for the first time.
    DataVersion = "706904!07/24/2020 08:00:32"
};

while (true)
{
    var response = (RetrieveEntityChangesResponse)_serviceProxy.Execute(request);

    if (response != null &&
        response.EntityChanges != null && response.EntityChanges.Changes != null)
    {
        foreach (var record in response.EntityChanges.Changes)
        {
            // Read NewOrUpdatedItem into the 'createdOrUpdatedRecords' list.
            if (record is NewOrUpdatedItem)
            {
                createdOrUpdatedRecords.Add((record as NewOrUpdatedItem).NewOrUpdatedEntity);
            }

            // Read RemovedOrDeletedItem into the 'deletedRecords' list.
            if (record is RemovedOrDeletedItem)
            {
                deletedRecords.Add((record as RemovedOrDeletedItem).RemovedItem);
            }
        }
    }

    // Display Created or Updated records
    createdOrUpdatedRecords.ForEach(x => Console.WriteLine("Created or Updated record id: {0}", x.Id));
    // Display Deleted records
    deletedRecords.ForEach(x => Console.WriteLine("Deleted record id: {0}", x.Id));

    // Logic for pagination
    if (!response.EntityChanges.MoreRecords)
    {
        // Store the token for a later query.
        token = response.EntityChanges.DataToken;
        break;
    }

    // Increment the page number to retrieve the next page.
    request.PageInfo.PageNumber++;
    // Set the paging cookie to the one returned from the current results.
    request.PageInfo.PagingCookie = response.EntityChanges.PagingCookie;
}
  • Run the code and you would get a response as below.

🙂

Categories: CRM

Azure function and Cosmos DB Table API – Method not found exception

While creating ‘Entities’ in the Azure Cosmos DB Table API from an Azure Function, we encountered the following exception.

Reason:

  • Azure Function application project was configured using ‘Azure Function v1 (.NET Framework)’.
  • The “Microsoft.Azure.Cosmos.Table” NuGet package is not compatible with ‘Azure Function v1 (.NET Framework)’.

Fix:

  • Change the Azure Function application project type to .NET Core.

Now let's understand the technical know-how to configure the ‘Azure Table API’ and interact with it from an Azure Function application project.

Configure ‘Azure Table API’

  • Login to your Azure Portal.
  • Create a new ‘Azure Cosmos DB Account’ by selecting ‘API’ as ‘Azure Table’.
  • Once deployment is complete, open the ‘Resource’.
  • Let's create a new table by clicking ‘New Table’ in ‘Data Explorer’. Provide the ‘Table id’ and click ‘OK’.
  • In the ‘Azure Table API’, a record/row is referred to as an ‘Entity’, and every ‘Entity’ must have ‘PartitionKey’ and ‘RowKey’ values.
    • You must include the PartitionKey and RowKey properties in every insert, update, and delete operation.
  • To connect to the ‘Azure Table API’ from external applications (e.g., Azure Functions), we need the ‘Connection String’.
  • Go to ‘Connection String’ tab and copy the ‘PRIMARY CONNECTION STRING’.
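The (PartitionKey, RowKey) pair acts as an entity's composite primary key within a table, which is why both are mandatory. A plain C# analogy of that uniqueness rule (an in-memory sketch, not the SDK; the sample names echo the snippet later in this article):

```csharp
using System;
using System.Collections.Generic;

class PartitionRowKeySketch
{
    // In-memory analogue of a table: entities keyed by (PartitionKey, RowKey).
    public static Dictionary<(string PartitionKey, string RowKey), string> BuildSampleTable()
    {
        var table = new Dictionary<(string PartitionKey, string RowKey), string>();

        // First write inserts the entity.
        table[("Harp", "Walter")] = "Walter@contoso.com";
        // A second write with the same key pair merges/replaces it - no duplicate row.
        table[("Harp", "Walter")] = "Walter.H@contoso.com";
        // A different RowKey under the same partition is a separate entity.
        table[("Harp", "Anna")] = "Anna@contoso.com";

        return table;
    }

    static void Main()
    {
        Console.WriteLine(BuildSampleTable().Count); // prints 2
    }
}
```

This mirrors why an InsertOrMerge operation on an existing key pair updates the entity instead of creating a new one.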

Create ‘Azure Function Application’ project

  • Open Visual Studio and create a new ‘Azure Functions’ project.
  • Select either v2 or v3 (.NET Core).
  • Rename the ‘Function Name’ to a meaningful one. I set my function name as ‘CosmosTableAPI’.
  • Add ‘Microsoft.Azure.Cosmos.Table’ NuGet package.
  • Following is the code snippet to connect to the ‘Azure Table API’ and insert entities.

public static async Task Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    await CreateTableandAddData();
}

public static async Task CreateTableandAddData()
{
    string tableName = "customers";

    // Create or reference an existing table
    CloudTable table = await CreateTableAsync(tableName);

    IndividualEntity customer = new IndividualEntity("Harp", "Walter")
    {
        Email = "Walter@contoso.com",
        PhoneNumber = "425-555-0101"
    };

    Console.WriteLine("Insert an Entity.");
    customer = await InsertOrMergeEntityAsync(table, customer);
}

public static async Task<IndividualEntity> InsertOrMergeEntityAsync(CloudTable table, IndividualEntity entity)
{
    try
    {
        // Create the InsertOrMerge table operation
        TableOperation insertOrMergeOperation = TableOperation.InsertOrMerge(entity);

        // Execute the operation.
        TableResult result = await table.ExecuteAsync(insertOrMergeOperation);
        IndividualEntity insertedCustomer = result.Result as IndividualEntity;

        // Get the request units consumed by the current operation.
        // RequestCharge of a TableResult is only populated for Azure Cosmos DB.
        if (result.RequestCharge.HasValue)
        {
            Console.WriteLine("Request Charge of InsertOrMerge Operation: " + result.RequestCharge);
        }

        return insertedCustomer;
    }
    catch (StorageException e)
    {
        Console.WriteLine(e.Message);
        throw;
    }
}

public static async Task<CloudTable> CreateTableAsync(string tableName)
{
    string storageConnectionString = "{Connection string copied from Azure Table API in previous section}";

    // Retrieve storage account information from the connection string.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnectionString);

    // Create a table client for interacting with the table service
    CloudTableClient tableClient = storageAccount.CreateCloudTableClient(new TableClientConfiguration());
    tableClient.TableClientConfiguration.UseRestExecutorForCosmosEndpoint = true;

    // Get a reference to the table (created if it doesn't exist yet)
    CloudTable table = tableClient.GetTableReference(tableName);
    if (await table.CreateIfNotExistsAsync())
    {
        Console.WriteLine("Created Table named: {0}", tableName);
    }
    else
    {
        Console.WriteLine("Table {0} already exists", tableName);
    }

    return table;
}

public class IndividualEntity : TableEntity
{
    public IndividualEntity()
    {
    }

    public IndividualEntity(string lastName, string firstName)
    {
        PartitionKey = lastName;
        RowKey = firstName;
    }

    public string Email { get; set; }
    public string PhoneNumber { get; set; }
}

  • Compile and run the project.
  • Visual Studio opens a console window as below. Copy the URL.
  • Hit the URL from a browser or the Postman tool.
  • You would get the following output in the console.
  • Go to the Azure Portal and you should see the new entity created.

🙂

Categories: CRM

Embed Canvas App Error – Environment doesn’t have any CDS database

If you encountered the following exception while embedding a Canvas app in a Model Driven App, then this article is for you.

Canv_4

Reason:

  • You must have a CDS database in your environment to configure an embedded Canvas app, as the Canvas app metadata gets stored in the CDS database.

Fix:

  • Connect to PowerApps Admin Center
  • Open the ‘default’ environment.
  • Click on ‘Add database’.

PAPPS_1

  • Give it some time to complete the database setup.
  • Once done, you would see the ‘Database version’ rendered as below:

Canv_5

  • Retry embedding Canvas app.

Refer to this article for step-by-step details on embedding a Canvas app in a Model Driven App.

🙂

 

DataMigrationUtility – Unable to connect to the Dynamics instance

I was getting the following error while connecting to a Dynamics instance using the ‘DataMigrationUtility’ tool that comes with the SDK.

You don’t have permission to access any of the organizations in the Microsoft Common Data Service region that you specified. If you’re not sure which region your organization resides in, choose “Don’t know” for the CDS region and try again. Otherwise check with your CDS administrator.

 

MigrationUtility_1

 

Fix:

It seems the issue is with ‘DataMigrationUtility’, and the following workaround resolved it in my case:

  • Get the latest version of the ‘DataMigrationUtility’ tool.
  • In the Login screen,
    • Select ‘Online Region’
    • Leave ‘User Name’ and ‘Password’ fields blank

MigrationUtility_2

  • Click ‘Login’.
  • The tool will prompt you for credentials in a new popup.
  • Enter your credentials and click ‘Connect’.

Notes:

  • I tried unchecking the ‘Show Advanced’ option but had no luck.
  • Also, I could not connect when I chose the “Don’t Know” option in the ‘Online Region’ field and left the ‘User Name’ and ‘Password’ fields blank.
  • The exact combination specified in ‘Fix’ worked for me.
  • After the workaround, I could connect to the Dynamics instance from ‘DataMigrationUtility’ normally the next time.

🙂

StyleCop error – The parameter is incorrect

The other day, I encountered the following error while triggering ‘Run StyleCop’ from Visual Studio.

StyleCop_Error

Reason & Fix:

  • In my case, I cloned a DevOps branch and ran ‘Run StyleCop’ without building the code.
  • Since there were no executables, StyleCop threw the error ‘The parameter is incorrect’.
  • Building the solution before running ‘Run StyleCop’ fixes the issue.

StyleCop_Error_1

🙂

PL-900: Microsoft Power Platform Fundamentals – Prep Notes

February 25, 2020 Leave a comment

There was an ask from a Power Platform beginner on my blog about preparation for the PL-900 exam.

As I attained the certification recently, I am going to share the topics I covered for my preparation in this article.

***This article is strictly for guidance purposes and by no means do I intend to post questions from the exam***

Areas I covered during my preparation:

  • Power Platform Environments
    • Types of Environments
    • Use of Default Environment
    • Built-in Roles – Admin Role, Maker Role
  • Canvas Apps vs Model Driven Apps
    • Understand that ‘Model Driven Apps’ can only consume CDS data.
    • You need Canvas app for other types of data.
  • Data Connectors
    • Tabular data and Functional-based data connectors
  • Power Automate Flow
    • How to post CDS data to SAP/Salesforce/etc.
    • Integrate Flow with Canvas and Model Driven Apps.
    • How to test the Flow.
  • Power Apps Portals
    • Understand the basics of Web Page, Web Role, Web Templates.
  • AI Builder
    • Understand what you can achieve with the 4 models.

PL900_1

  • Power BI
    • Power BI Desktop vs Power BI Service : What can and can’t be done with each.
    • Data Modeling.
    • How to create and share Dashboards.

If you have any further questions or need guidance, feel free to leave a comment.

🙂

 

Categories: CRM, PowerApps

Power Platform – Pass external API collection from Power Automate to Canvas App

February 9, 2020 1 comment

In this article, let's see how to pass an external API's json collection from Power Automate (formerly ‘Microsoft Flow’) to a Canvas application.

For this example, I am going to use the ESRI mapping service API in my Power Automate flow. Refer here to know more about ESRI. You may also use other APIs as per your convenience.

The ESRI API is URL-based and returns ‘Address Suggestions’ in json format.

Flow_Json_3
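For reference, the suggestions payload returned by the ESRI API is shaped roughly like this (an illustrative fragment; the exact fields depend on the endpoint and API version, and the magicKey value is elided):

```json
{
  "suggestions": [
    {
      "text": "380 New York St, Redlands, CA, 92373",
      "magicKey": "...",
      "isCollection": false
    }
  ]
}
```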

Before we start, let's make sure to meet all the prerequisites.

Prerequisites:

  • Subscribe to the 30-day trial and get an Office 365 account.
  • Connect to Power Automate portal using Office 365 account to build Power Automate flow.
  • Connect to Power Apps maker portal using Office 365 account to build the Canvas App.
  • ESRI API URL – refer to my previous article on the usage of ESRI, or you can simply use this url.

Once all prerequisites are met, here is the high-level design:

  • Create a new Power Automate flow and call ESRI API to fetch Address Suggestions.
    • You can take any open API which returns json collection.
  • Parse the ESRI API Response to json collection.
  • Create a new Canvas app
  • Trigger the Power Automate flow from the Canvas App and read the collection.

Let's get started.

Steps to configure the Power Automate Flow:

  • Connect to Power Automate portal
  • Create a new ‘Instant flow’ by selecting the right Environment.

Flow_Json_1

  • Provide the name ‘GetAddressSuggestions’ and select the trigger as ‘PowerApps’.

Flow_Json_2

  • To call the ESRI API, add a new ‘HTTP’ action.

Flow_Json_4

  • Choose the Method as ‘GET’ and in the URI paste the ESRI URL mentioned in the prerequisites section.

Flow_Json_5

  • Next, we need to parse the response from the ESRI API. As the ESRI results are in json format, add a ‘Parse JSON’ action.

Flow_Json_6

  • In ‘Parse JSON’ action,
    • set the Content as ‘Body’ from the HTTP action.
    • Next, we need to provide the json schema of the ESRI response. To do that, click ‘Generate from sample’.
    • Flow_Json_9
    • Now copy the response from the ESRI API (copy the browser's output using Ctrl+A and copy).
    • Flow_Json_7
    • Paste it in the ‘Insert a sample JSON Payload’ window and click ‘Done’.
    • Flow_Json_8
    • If no warnings, your ‘Parse JSON’ pane should look as below.
    • Flow_Json_10
  • Now that we have captured the ESRI response and parsed it to json, we need to pass the captured json response to the Power App.
  • To pass the json response to Power App, add a new ‘Response’ action.

Flow_Json_11

  • In the ‘Response’ pane,
    • Set the ‘Body’ to ‘Body’ from ‘Parse JSON’.
    • Flow_Json_12
    • Expand ‘Show advanced options’ and click on ‘Generate from sample’.
    • Copy the response from ESRI API and paste in the ‘Insert a sample JSON Payload’ window and click ‘Done’. (Same step like we did in ‘Parse JSON’ pane).
    • ‘Response’ pane should look as below with no warnings.
    • Flow_Json_13
  • Run the flow and make sure it runs successfully.

Flow_Json_18

Steps to configure the Canvas App:

As we completed the Power Automate flow, now it's time to consume its response from the Canvas App by following the steps below.

  • Connect to Power Apps maker portal using the same Office 365 account.
  • Create a new Canvas App.
    • Note: Make sure the same Environment used for Power Automate is selected.
  • Add a new button.
  • Select the button, from the ribbon menu, click on ‘Action -> Power Automate’.
  • From the pane select the ‘GetAddressSuggestions’ Power app flow.

Flow_Json_14

  • After a few seconds, you would see the Run() command auto-populated as below.

Flow_Json_15

  • As we are going to get a json collection from the flow, let's capture it into a collection variable ‘collAddress’ using ‘ClearCollect()’. Refer to this article to know more about ‘ClearCollect()’.
  • With the ClearCollect() and Run() functions, the final ‘OnSelect’ statement should look as below.

Flow_Json_16

  • Let's run the App and click on the button. The application takes a couple of minutes to complete the run.

Flow_Json_17

  • Post run, check the collection returned from the flow by going to the ‘File -> Collections’ tab.

Flow_Json_19

  • You can also add a ‘Data table’ control and display the results returned from the flow as below.

Flow_Json_20

🙂

 

Power Platform – Pass json collection from Canvas App to Power Automate

February 9, 2020 1 comment

Refer to my previous article for steps to pass a json collection from Power Automate to a Canvas App.

Now let's see how to pass a json collection from a Canvas app to a Power Automate flow.

Before we start, let's make sure to meet all the prerequisites.

Prerequisites:

  • Subscribe to the 30-day trial and get an Office 365 account.
  • Connect to Power Automate portal using Office 365 account to build Power Automate flow.
  • Connect to Power Apps maker portal using Office 365 account to build the Canvas App.

Once all prerequisites are met, here is the high-level design:

  • Create a new Canvas app.
  • Prepare a json collection.
  • Create a new Power Automate flow.
  • Trigger the Power Automate flow from the Canvas App by passing the collection as a parameter.
  • Read the json collection from Power Automate flow.
  • Create ‘Contact’ records in CDS.

Let's get started.

Steps to configure Canvas App:

  • Connect to Power Apps maker portal using the Office 365 account.
  • Create a new Canvas App.
    • Note: Make sure the same Environment used for Power Automate is selected.
  • Add a new button ‘Generate Sample Collection’.
  • On the button's ‘OnSelect’, prepare a json collection using the ‘ClearCollect’ function as below.

Flow_Json_21

  • Once you run the App and click the ‘Generate Sample Collection’ button, you should see the collection in the ‘File -> Collections’ tab.

Flow_Json_22

Steps to configure the Power Automate Flow:

  • Connect to the Power Automate portal using the same Office 365 credentials used to connect to the Power Apps portal.
  • Create a new ‘Instant flow’ by selecting the right Environment.

Flow_Json_1

  • Provide a name ‘GetCollectionFromPowerApp’ and select trigger as ‘PowerApps’.
  • Now we need to read the collection passed from the Canvas app into a variable. Let's add a new ‘Initialize Variable’ action.
  • In the ‘Initialize Variable’ action,
    • Provide the variable name by setting the ‘Name’ field. I set it as ‘collPersons’.
    • Type: String
    • Select ‘Value’ and click ‘Ask in PowerApps’. An auto-generated name will appear; select it.
    • Flow_Json_25
  • Once we receive the collection and store it in the string variable ‘collPersons’, we next need to parse it into a json collection.
  • Add a new ‘Parse JSON’ action.
  • In ‘Parse JSON’ action,
    • set the Content as ‘collPersons’ from the ‘Initialize Variable’ action.
    • Next, we need to provide the json schema coming from the Canvas app. To do that, click ‘Generate from sample’.
    • Copy the json collection formed in the Canvas app using the ClearCollect function and paste it as below.
    • Flow_Json_27
    • If no warnings, your ‘Parse JSON’ pane should look as below.
    • Flow_Json_28
  • Now that the json collection is parsed, we are good to start the final step: creating records in CDS.
  • Add the ‘Create a new record’ CDS action and select the ‘Environment’ and ‘Entity Name’.

Flow_Json_33

  • Set the attributes (Last Name and Email) from ‘Parse JSON’ action.

Flow_Json_34

  • Once you set the variables in the ‘Create a new record’ CDS action, the flow automatically wraps the action in ‘Apply to each’.

Flow_Json_29
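For reference, the string arriving from the Canvas app's JSON() call (the value parsed by the ‘Parse JSON’ action above) looks something like this; the field names follow the example collection and the values are illustrative:

```json
[
  { "LastName": "Walter", "Email": "Walter@contoso.com" },
  { "LastName": "Harp", "Email": "Harp@contoso.com" }
]
```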

Trigger the Power automate flow from Canvas App:

Now that we have a Power Automate flow with the logic to read and parse the json collection and add the items to CDS as Contact records, let's trigger the flow from the Canvas App.

  • Open the Canvas app created in above sections.
  • Add a new button ‘Share Collection To Flow’.
  • Select the button, from the ribbon menu, click on ‘Action -> Power Automate’.
  • From the pane select the ‘GetCollectionFromPowerApp’ Power app flow.
  • After a few seconds, you would see the Run() command auto-populated. Now pass the ‘collPersonalInfo’ collection wrapped in the JSON() function.
    • The JSON() function is required to convert the ‘collPersonalInfo’ collection to a JSON string, as the Power Automate flow expects a ‘String’ variable in its first action.
  • Finally ‘OnSelect’ should look as below.

 

Flow_Json_30

  • Now run the App and click the ‘Share Collection To Flow’ button. Post run, go to the Power Automate flow's run history and you should see it ran successfully.

Flow_Json_31

🙂

Categories: CRM, PowerApps