Power Platform Admin Analytics

September 27, 2020

In previous Dynamics versions, with the help of Organization Insights, available as a preferred solution on AppSource, we could get organization-level insights such as the number of active users, API calls, etc.

Now these analytics can be viewed right from the Power Platform Admin Center, with no need for additional solutions.

Currently, three types of analytics are available: Common Data Service, Power Automate, and Power Apps.

Who can view the Analytics reports:

Admins with the following roles and a license can view the analytics:

  • Environment Admin – can view reports for the environments that the admin has access to.
  • Power Platform admin – can view reports for all environments.
  • Dynamics 365 admin – can view reports for all environments.
  • Microsoft 365 Global admin – can view reports for all environments.

Refer to this article for more details.

🙂

Categories: CRM, PowerApps

Microsoft Teams and PowerApps – Project Oakdale

September 24, 2020

Project Oakdale, a new built-in low-code data platform for Teams that provides enterprise relational datastores with rich data types to Teams users, is now in public preview.

Solutions built with Power Platform can be easily published to the Teams app store and can be used off the shelf or customized for specific needs.

Let’s see how to embed a Power App in Teams.

  • Connect to the Office 365 portal and check if you have ‘Teams’ enabled.
  • If Teams is not available, go to the Teams site and sign in with your Office 365 credentials.
  • In Teams, click on Apps -> Power Apps and click ‘Add’.
  • A ‘Power Apps’ tab gets added to Teams.
  • From ‘Power Apps’, either create a new app by clicking ‘Create an app’ or pick any of the readily available apps.
  • Once you choose an existing app, select a team’s channel where this app needs to be available. I chose the ‘Hello World’ team’s ‘General’ channel.
  • The app takes a few minutes to complete the setup.
  • Grant the required permissions.
  • The app loads as below and is ready to be used.

Notes:

  • Project Oakdale environments are automatically created for the selected team when you create a Power app in Teams for the first time or install a Power Apps app from the app catalog. See About the Project Oakdale environment.

🙂

Categories: PowerApps

[Code Snippet] Using Microsoft.Pfe.Xrm library

September 20, 2020

The Pfe.Xrm library allows us to submit a set of XRM requests as batches (using ExecuteMultipleRequest) and execute them in parallel.

In this article, let’s see how to use the Microsoft.Pfe.Xrm library with a simple ‘Retrieve’ request.

What is the Microsoft.Pfe.Xrm library?

  • Contains a set of common components for building solutions using the Dynamics 365 SDK, developed by Microsoft Premier Field Engineering (PFE) and core engineering teams.

Download Microsoft.Pfe.Xrm

  • Use ‘Tools -> NuGet Package Manager’ in Visual Studio to download the NuGet package.

Building blocks of Microsoft.Pfe.Xrm library

‘Microsoft.Pfe.Xrm’ allows us to trigger any ‘OrganizationRequest’ in batches (i.e., ExecuteMultipleRequest) in parallel. Let’s understand the building blocks of the Pfe library.

  • List<OrganizationRequest>
    • As a first step, prepare your ‘RetrieveMultipleRequest’ objects and add them to a List.
  • AsBatches() method
    • Call the Pfe library’s AsBatches() method, passing the List prepared in the previous step.
    • ‘AsBatches()’ returns an IDictionary<string, ExecuteMultipleRequest> object.
  • OrganizationServiceManager
    • Instantiate the ‘Microsoft.Pfe.Xrm’ OrganizationServiceManager to create the connection object.
    • There are multiple ‘OrganizationServiceManager’ constructor overloads.
    • In this article, I am using the overload that takes the organization service URI and credentials (see the CRMPfeConnection property in the snippet below).
  • ParallelProxy.Execute
    • This is the final method; it accepts the ‘IDictionary<string, ExecuteMultipleRequest>’ object returned by the AsBatches() method discussed above.
    • ParallelProxy.Execute also requires the OrganizationServiceManager created in the previous step.

IDictionary<string, ExecuteMultipleResponse> batchResponses = pfeXrmConnection.ParallelProxy.Execute(batchesOrgRequest);

Code Snippet

Now let’s put all the building blocks together; the result is the following code snippet.

    // Requires the Microsoft.Pfe.Xrm NuGet package and the Dynamics 365 SDK assemblies
    // (Microsoft.Xrm.Sdk, Microsoft.Xrm.Sdk.Messages, Microsoft.Xrm.Sdk.Query).
    public void RetrieveUsingPFE()
    {
        // Retrieve 'Account' and 'Contact'. You can prepare your own list of entities.
        var listEntities = new List<string> { "account", "contact" };
        var batchRequests = new List<OrganizationRequest>();            

        int pageNumber = 1;
        string pagingCookie = null;

        // Filter for Active records. You can prepare your required filters as below.
        var filterActive = new FilterExpression(LogicalOperator.And);
        var condActive = new ConditionExpression("statecode", ConditionOperator.Equal, 0);
        filterActive.AddCondition(condActive);

        try
        {
            foreach (var entity in listEntities)
            {
                // Prepare 'Query Expression'
                var query = new QueryExpression(entity)
                {
                    ColumnSet = new ColumnSet(true),
                    PageInfo = new PagingInfo()
                };
                query.PageInfo.Count = 5000;
                query.PageInfo.PageNumber = pageNumber;
                query.PageInfo.PagingCookie = pagingCookie;

                query.Criteria.AddFilter(filterActive);

                var request = new RetrieveMultipleRequest
                {
                    Query = query
                };

                batchRequests.Add(request);
            }

            do
            {
                // Trigger Pfe.Execute
                var batchResponses = TriggerRequests(batchRequests);
                batchRequests = new List<OrganizationRequest>();
                pageNumber++;
                foreach (var responseItem in batchResponses)
                {
                    if (responseItem.EntityCollection.Entities.Count > 0)
                    {                            
                        var currEntityName = responseItem.EntityCollection.Entities[0].LogicalName;

                        foreach (var record in responseItem.EntityCollection.Entities)
                        {
                            // 'record' is the retrieved record
                            // Perform your desired operations
                        }
                    }

                    if (responseItem.EntityCollection.MoreRecords)
                    {
                        var query = new QueryExpression(responseItem.EntityCollection.EntityName)
                        {
                            ColumnSet = new ColumnSet(true)
                        };

                        query.Criteria.AddFilter(filterActive);

                        var request = new RetrieveMultipleRequest();
                        query.PageInfo = new PagingInfo
                        {
                            Count = 5000,
                            PageNumber = pageNumber,
                            PagingCookie = responseItem.EntityCollection.PagingCookie
                        };

                        request.Query = query;
                        batchRequests.Add(request);
                    }
                }
            } while (batchRequests.Count > 0);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error occurred, error message is {0}", ex.Message);
            throw; // rethrow, preserving the stack trace
        }            
    }

    public List<RetrieveMultipleResponse> TriggerRequests(List<OrganizationRequest> batchRequests)
    {
        var recordsPerRequest = 1; // Set your desired value.
        var retrieveRequestBatches = batchRequests.AsBatches(recordsPerRequest);
        var batchResponses = ExecuteParallelProxy(retrieveRequestBatches);            
        var listResponses = new List<RetrieveMultipleResponse>();

        foreach (var key in batchResponses.Keys)
        {
            foreach (var result in batchResponses[key].Responses)
            {
                if (retrieveRequestBatches[key].Requests[result.RequestIndex] is RetrieveMultipleRequest)
                {
                    var originalRequest = (RetrieveMultipleRequest)retrieveRequestBatches[key].Requests[result.RequestIndex];
                    // Capture failed records
                    if (result.Fault != null)
                    {
                        Console.WriteLine($" Exception : {result.Fault.Message}");
                    }
                    else if (result.Response != null && result.Response is RetrieveMultipleResponse) // Capture success records
                    {
                        var responseItem = (RetrieveMultipleResponse)result.Response;
                        listResponses.Add(responseItem);
                    }
                }                    
            }
        }

        return listResponses;
    }

    private IDictionary<string, ExecuteMultipleResponse> ExecuteParallelProxy(IDictionary<string, ExecuteMultipleRequest> retrieveRequestBatches)
    {            
        IDictionary<string, ExecuteMultipleResponse> batchResponses = null;
        try
        {
            batchResponses = CRMPfeConnection.ParallelProxy.Execute<ExecuteMultipleRequest, ExecuteMultipleResponse>(retrieveRequestBatches);                
        }
        catch (AggregateException ae)
        {
            foreach (var ex in ae.InnerExceptions)
            {
                Console.WriteLine($"Aggregate Error in ExecuteParallelProxy : {ex.Message.ToString()}");
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Error in ExecuteParallelProxy : {ex.Message.ToString()}");
        }

        return batchResponses;
    }

    private static OrganizationServiceManager crmPfeConnection;

    public static OrganizationServiceManager CRMPfeConnection
    {
        get
        {
            if (crmPfeConnection == null)
            {
                var UserName = ConfigurationManager.AppSettings["UserNameSrc"].ToString();
                var Password = ConfigurationManager.AppSettings["PasswordSrc"].ToString();
                var OrganizationUri = ConfigurationManager.AppSettings["OrganizationUriSrc"].ToString();
                ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
                crmPfeConnection = new OrganizationServiceManager(XrmServiceUriFactory.CreateOrganizationServiceUri(OrganizationUri), UserName, Password);
            }
            return crmPfeConnection;
        }
    }
  • Similar to ‘RetrieveMultipleRequest’, you can batch and trigger any kind of request, as sketched below.
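
For instance, a minimal sketch of bulk-creating records with the same building blocks (the entity name, attribute values, and batch size here are illustrative):

    // Prepare a list of CreateRequest objects (illustrative data)
    var createRequests = new List<OrganizationRequest>();
    for (int i = 0; i < 100; i++)
    {
        var account = new Entity("account");
        account["name"] = "Bulk Account " + i;
        createRequests.Add(new CreateRequest { Target = account });
    }

    // Split into batches of 10 requests each and execute them in parallel
    var createBatches = createRequests.AsBatches(10);
    var createResponses = CRMPfeConnection.ParallelProxy.Execute<ExecuteMultipleRequest, ExecuteMultipleResponse>(createBatches);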

🙂

Power Apps | Custom connector | User does not have an entitlement to use PowerApps

September 2, 2020

From a Dynamics trial instance, while registering a custom connector, I encountered the following issue.

Reason:

  • The error message clearly states that the user trying to register the custom connector does not have a ‘Power Apps’ license.
  • By default, when you subscribe for a trial, Dynamics grants only the ‘Dynamics 365 Customer Engagement Applications’ license, which does not include the ‘Power Apps’ option.
  • To check the user’s license, connect to the Microsoft 365 Admin Center.
  • Go to the user’s ‘Manage product licenses’ tab.

Fix:

  • Grant a ‘Power Apps’ license to the user.
  • One option is to subscribe to the ‘Dynamics 365 Customer Engagement Plan’ license by following the steps below.
  • Connect to the Microsoft 365 Admin Center.
  • Go to ‘Billing -> Purchase services’.
  • Select ‘Dynamics 365 Customer Engagement Plan Trial’ and click on ‘Get free trial’.
  • Upon procurement, you would get the following confirmation page.
  • Now open the user and grant the ‘Dynamics 365 Customer Engagement Plan’ license.
    • Make sure ‘PowerApps for Dynamics 365’ is enabled.
  • You should now be able to configure the custom connector.
  • Refer to the Power Apps licensing guide for more details.

🙂

Categories: PowerApps

SQL SERVER | Merge Statement | The insert column cannot contain multi-part identifiers

The other day, while executing a SQL ‘MERGE’ statement on two tables, I ran into the following error.

The insert column list used in the MERGE statement cannot contain multi-part identifiers

Before we understand the reason and fix for this error, let’s get the basics of the ‘MERGE’ statement.

‘MERGE’ statement:

  • The ‘MERGE’ statement can be used when you have two SQL tables and want to synchronize them by inserting, updating, or deleting rows in one table based on differences found in the other table.
  • ‘MERGE’ runs insert, update, or delete operations on a target table from the results of a join with a source table.

Step by Step using ‘MERGE’:

Let’s understand how MERGE works with an example.

  • Create two tables, ‘Cust_Source’ and ‘Cust_Trg’. Column ‘CID’ acts as the joining column between the two tables, as sketched below.
‘Source’ and ‘Target’ tables
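
For reference, a minimal sketch of the two tables (the columns are taken from the MERGE query below; the sample rows are illustrative):

CREATE TABLE [dbo].[Cust_Source] (CID INT PRIMARY KEY, [Name] VARCHAR(50), Salary MONEY, WorkExp INT);
CREATE TABLE [dbo].[Cust_Trg]    (CID INT PRIMARY KEY, [Name] VARCHAR(50), Salary MONEY, WorkExp INT);

-- Illustrative sample data
INSERT INTO [dbo].[Cust_Source] VALUES (1, 'Raj', 5000, 5), (2, 'Sarah', 6000, 7);
INSERT INTO [dbo].[Cust_Trg]    VALUES (1, 'Raj', 4000, 4), (3, 'Mike', 3000, 2);
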
  • Let’s use MERGE to sync ‘Cust_Trg’ from the ‘Cust_Source’ table, covering the following points:
    • Update the matching records in ‘Cust_Trg’ using the ‘Cust_Source’ rows.
    • Create (i.e., INSERT) the unavailable records in ‘Cust_Trg’ from ‘Cust_Source’.
    • Delete the excessive records from ‘Cust_Trg’ by comparing with the ‘Cust_Source’ rows.
‘MERGE’ query
Query:

MERGE [dbo].[Cust_Trg] AS Trg
USING [dbo].[Cust_Source] AS Src
ON Trg.CID = Src.CID
WHEN MATCHED -- UPDATE the Target record fields from Source values.
THEN UPDATE SET Trg.Salary = Src.Salary, Trg.WorkExp = Src.WorkExp
WHEN NOT MATCHED BY TARGET -- INSERT the record in Target as it's not part of Target yet.
THEN INSERT (CID, [Name], Salary, WorkExp)
VALUES (Src.CID, Src.[Name], Src.Salary, Src.WorkExp)
WHEN NOT MATCHED BY SOURCE -- DELETE the record from Target as it's not part of Source.
THEN DELETE

OUTPUT $action, -- INSERT/UPDATE/DELETE
DELETED.CID AS TargetCID,
DELETED.Name AS TargetName,
DELETED.Salary AS TargetSalary,
DELETED.WorkExp AS TargetWorkExp,
INSERTED.CID AS SourceCID,
INSERTED.Name AS SourceName,
INSERTED.Salary AS SourceSalary,
INSERTED.WorkExp AS SourceWorkExp;

SELECT @@ROWCOUNT AS 'No of Affected rows'

  • Execute the query and you would get results as below.

Issue encountered with MERGE:

  • When I ran my initial MERGE query, I ran into the following exception.
Don’t use Alias in ‘INSERT’
  • The issue was due to the usage of an alias in the INSERT column list (i.e., Trg.Salary, etc.).
  • Removing the alias and rerunning the query solves the issue, as sketched below.
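
A minimal before/after sketch of the INSERT clause (the column list matches the query above):

-- Fails: multi-part identifiers in the INSERT column list
THEN INSERT (Trg.CID, Trg.[Name], Trg.Salary, Trg.WorkExp)
VALUES (Src.CID, Src.[Name], Src.Salary, Src.WorkExp)

-- Works: plain column names in the INSERT column list
THEN INSERT (CID, [Name], Salary, WorkExp)
VALUES (Src.CID, Src.[Name], Src.Salary, Src.WorkExp)
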
Notes:

  • The MERGE statement works best when the two tables have a complex mixture of matching characteristics, for example, inserting a row if it doesn’t exist, or updating a row if it matches.
  • When simply updating one table based on the rows of another table, you can improve performance and scalability with basic INSERT, UPDATE, and DELETE statements.

🙂

Categories: SQL

Working with JSON data in SQL Server

July 26, 2020

In this article, let’s see how to work with JSON data in SQL Server using SSMS. The following features will be covered:

  • Format SQL table data in JSON format (Using FOR JSON PATH clause)
  • Store JSON data in SQL table
  • Query JSON data in SQL server (Using JSON_VALUE and JSON_QUERY functions)
  • Update JSON data in SQL table (Using JSON_MODIFY function)
  • Convert JSON data into SQL table format (Using OPENJSON function)
  • Validate JSON data format (Using ISJSON function)

Format SQL table data in JSON format:

Let’s see how to format SQL query tabular results in JSON format using the FOR JSON clause.

  • Create a simple ‘Employee’ table in SQL Server.
  • When you query the ‘Employee’ table, you get the result in tabular format.
  • Let’s convert the result set into JSON format using the FOR JSON clause.
-- JSON PATH
select *
from Employee
FOR JSON PATH
  • Use WITHOUT_ARRAY_WRAPPER to get a single JSON object instead of an array. Use this option if the result of the query is a single object.
-- JSON PATH and WITHOUT_ARRAY_WRAPPER
select *
from Employee
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER
  • We can also form nested JSON objects. Let’s see how to show ‘Address’ as a separate object.
-- JSON PATH, WITHOUT_ARRAY_WRAPPER and 'Address' as separate object
select
ID,Name,
Street1 as [Address.Street1], Street2 as [Address.Street2],City as [Address.City], ZipCode as [Address.ZipCode]
from Employee
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER

Store JSON data in SQL table:

  • JSON is a textual format that can be used like any other string type in SQL Database; JSON data can be stored in a standard NVARCHAR column.
  • In the example below, the SQL table has a ‘JSONData’ column of type NVARCHAR(MAX), as sketched below.
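
A minimal sketch of such a table (the ‘Products’ table name and the JSON fields ‘Color’, ‘Price’, and ‘Tags’ match the queries used later in this article; the sample row is illustrative):

CREATE TABLE Products (
    ID INT IDENTITY PRIMARY KEY,
    JSONData NVARCHAR(MAX)
);

-- Illustrative sample row
INSERT INTO Products (JSONData)
VALUES (N'{"Name":"Pencil","Color":"Yellow","Price":500,"Tags":["Stationery","Office"]}');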

Query JSON data in SQL server:

  • Let’s query the JSON data using JSON_VALUE and JSON_QUERY.
  • JSON_VALUE:
    • The JSON_VALUE function extracts a scalar value from JSON text stored in a SQL column.
    • The extracted value can be used in any part of a SQL query.
  • JSON_QUERY:
    • The JSON_QUERY function extracts a complex sub-object, such as an array or an object, placed in JSON text.
    • Example: the ‘Tags’ array in the JSON data above.
Query records whose Color is ‘Yellow’
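
A minimal sketch of such a query, assuming the ‘Products’ table described above:

SELECT ID,
       JSON_VALUE(JSONData, '$.Color') AS Color,  -- scalar value
       JSON_QUERY(JSONData, '$.Tags')  AS Tags    -- JSON array
FROM Products
WHERE JSON_VALUE(JSONData, '$.Color') = 'Yellow'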

Update JSON data in SQL table:

  • The JSON_MODIFY function can be used to update JSON text without re-parsing the entire structure.
  • Below is the query to update the ‘Price’ of all ‘Yellow’ color products to 1000.
Update Price for records where Color is ‘Yellow’
Update Products
set JSONData = JSON_MODIFY(JSONData,'$.Price',1000)
where
JSON_VALUE(JSONData,'$.Color') = 'Yellow'
  • Post query execution, ‘Price’ is updated to 1000 as shown below.

Convert JSON data in to SQL table format

  • Using the OPENJSON function, JSON data can be transformed into SQL table format.
  • OPENJSON
    • It’s a table-valued function that parses JSON text, locates an array of JSON objects, iterates through the elements of the array, and returns one row in the output result for each element of the array.
Convert JSON data to Table format
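
A minimal sketch, again assuming the ‘Products’ table described above:

SELECT P.ID, J.[Name], J.Color, J.Price
FROM Products P
CROSS APPLY OPENJSON(P.JSONData)
WITH (
    [Name] VARCHAR(50) '$.Name',
    Color  VARCHAR(50) '$.Color',
    Price  INT         '$.Price'
) AS J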

Validate JSON data format

  • Since JSON is stored as standard text, there is no guarantee that the values stored in text columns are properly formatted; the ISJSON function can be used to verify this.
  • The ISJSON function returns the value 1 if the data is properly formatted JSON, so you can filter with a predicate such as:
ISJSON(SQL_Cell) > 0
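
For example, a minimal sketch against the ‘Products’ table described above:

-- Return only the rows whose JSONData column holds valid JSON
SELECT *
FROM Products
WHERE ISJSON(JSONData) > 0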

🙂

Categories: SQL

Power Apps | CDS | Track deleted records using ‘Change Tracking’

For the unversed, Change Tracking in CDS allows data to be synchronized by detecting what data has changed since the data was initially extracted or last synchronized.

It’s useful in integration scenarios, where you have to keep data between CDS and external systems in sync.

Assume that your customer has an LOB application alongside CDS, and you want to sync the LOB application whenever a record gets created, updated, or deleted in CDS; this can be achieved using the ‘Change Tracking’ feature.

In this article, let’s understand how ‘Change Tracking’ works in CDS using the ‘Organization Service’, to:

  • Get the created or updated records since the last sync.
  • Get the deleted records since the last sync.

Enabling Change Tracking for an Entity:

  • To use ‘Change Tracking’, make sure the change tracking feature is enabled for the entity.
  • This feature can be enabled using the customization user interface.
Enable Change Tracking

‘Change Tracking’ using the Organization Service:

  • When change tracking is enabled for an entity, the RetrieveEntityChangesRequest message can be used to retrieve the changes for that entity.
  • In my example, I am using the OOB ‘Account’ entity, which has 10 records.
  • When ‘RetrieveEntityChangesRequest’ is triggered for the first time, we get all 10 records in the ‘Changes’ object, along with a ‘Data Token’.
  • Copy the ‘Data Token’.
  • Now let’s delete an account and add a new account in CDS.
  • Trigger ‘RetrieveEntityChangesRequest’ again, passing the copied ‘Data Token’.
    • When the ‘Data Token’ is passed in ‘RetrieveEntityChangesRequest’, only the changes that occurred since that version are returned.
  • This time we get 2 records in the ‘Changes’ object:
    • 1 NewOrUpdatedItem
    • 1 RemovedOrDeletedItem
  • ‘NewOrUpdatedItem’ is an object of type ‘Entity’ with all the properties; ‘RemovedOrDeletedItem’ is an object of type ‘EntityReference’ containing only the GUID of the deleted record.
  • The following is the code snippet to track the changes and display them in the console.
// Created or Updated records will be returned as 'Entity' objects.
var createdorUpdateRecords = new List<Entity>();
// Deleted records will be returned as 'EntityReference' objects.
var deletedRecords = new List<EntityReference>();
// Data token to store for the next sync; populated by the loop below.
string token = null;

// Retrieve records by using Change Tracking feature.
var request = new RetrieveEntityChangesRequest
{
    EntityName = "account",
    Columns = new ColumnSet("name"),
    PageInfo = new PagingInfo() { 
       Count = 5000, 
       PageNumber = 1, 
       ReturnTotalRecordCount = false 
    },
    // DataVersion is an optional attribute.
    // If not passed, the system returns all the records.
    // Don't pass this attribute when calling for the first time.
    DataVersion = "706904!07/24/2020 08:00:32"
};

while (true)
{
    var response = (RetrieveEntityChangesResponse)_serviceProxy.Execute(request);

    if (response != null &&
        response.EntityChanges != null && response.EntityChanges.Changes != null)
    {
        foreach (var record in response.EntityChanges.Changes)
        {
            // Read NewOrUpdatedItem in to 'createdorUpdateRecords' list.
            if (record is NewOrUpdatedItem)
            {
                createdorUpdateRecords.Add((record as NewOrUpdatedItem).NewOrUpdatedEntity);
            }

            // Read RemovedOrDeletedItem in to 'deletedRecords' list.
            if (record is RemovedOrDeletedItem)
            {
                deletedRecords.Add((record as RemovedOrDeletedItem).RemovedItem);
            }
        }
    }

    // Logic for Pagination
    if (!response.EntityChanges.MoreRecords)
    {
        // Store the token for the next sync
        token = response.EntityChanges.DataToken;
        break;
    }

    // Increment the page number to retrieve the next page.
    request.PageInfo.PageNumber++;
    // Set the paging cookie to the paging cookie returned from current results.
    request.PageInfo.PagingCookie = response.EntityChanges.PagingCookie;
}

// Display Created or Updated records
createdorUpdateRecords.ForEach(x => Console.WriteLine("Created or Updated record id:{0}", x.Id));
// Display Deleted records
deletedRecords.ForEach(x => Console.WriteLine("Deleted record id:{0}", x.Id));
  • Run the code and you would get a response as below.

🙂

Categories: CRM

Power apps – Model driven app – Excel import not working

The other day, when we copied one of our environments to another and tried to import a CSV file using the ‘Excel’ import feature, the file got stuck in the ‘Submitted’ state.

Reason:

  • Post copy, the environment’s ‘Administration mode’ gets enabled by default, which prevents all background (async) operations.

Fix:

  • Turn off ‘Administration mode’.

🙂

Azure function and Cosmos DB Table API – Method not found exception

While creating entities in the Azure Cosmos DB Table API from an Azure Function, we encountered the following exception.

Reason:

  • The Azure Function application project was configured using ‘Azure Functions v1 (.NET Framework)’.
  • The ‘Microsoft.Azure.Cosmos.Table’ NuGet package is not compatible with ‘Azure Functions v1 (.NET Framework)’.

Fix:

  • Change the Azure Function application project type to .NET Core.

Now let’s go through the technical know-how to configure the ‘Azure Table API’ and interact with it from an Azure Function application project.

Configure ‘Azure Table API’

  • Log in to your Azure Portal.
  • Create a new ‘Azure Cosmos DB Account’, selecting ‘Azure Table’ as the ‘API’.
  • Once deployment is complete, open the ‘Resource’.
  • Let’s create a new table by clicking on ‘New Table’ from ‘Data Explorer’. Provide the ‘Table id’ and click ‘OK’.
  • In the ‘Azure Table API’, a record/row is referred to as an ‘Entity’, and every entity must have ‘PartitionKey’ and ‘RowKey’ values.
    • You must include the PartitionKey and RowKey properties in every insert, update, and delete operation.
  • To connect to the ‘Azure Table API’ from external applications (e.g., Azure Functions), we need the ‘Connection String’.
  • Go to the ‘Connection String’ tab and copy the ‘PRIMARY CONNECTION STRING’.

Create ‘Azure Function Application’ project

  • Open Visual Studio and create a new ‘Azure Functions’ project.
  • Select the .NET version: either v2 or v3 (.NET Core).
  • Rename the ‘Function Name’ to a meaningful one. I set my function name to ‘CosmosTableAPI’.
  • Add the ‘Microsoft.Azure.Cosmos.Table’ NuGet package.
  • The following is the code snippet to connect to the ‘Azure Table API’ and insert entities.

// Requires the 'Microsoft.Azure.Cosmos.Table' NuGet package.
// Namespaces used: Microsoft.AspNetCore.Http, Microsoft.Azure.WebJobs,
// Microsoft.Azure.WebJobs.Extensions.Http, Microsoft.Extensions.Logging, Microsoft.Azure.Cosmos.Table
[FunctionName("CosmosTableAPI")]
public static async Task Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    await CreateTableandAddData();
}

public static async Task CreateTableandAddData()
{
    string tableName = "customers";

    // Create or reference an existing table
    CloudTable table = await CreateTableAsync(tableName);

    IndividualEntity customer = new IndividualEntity("Harp", "Walter")
    {
        Email = "Walter@contoso.com",
        PhoneNumber = "425-555-0101"
    };

    Console.WriteLine("Insert an Entity.");
    customer = await InsertOrMergeEntityAsync(table, customer);
}

public static async Task<IndividualEntity> InsertOrMergeEntityAsync(CloudTable table, IndividualEntity entity)
{
    try
    {
        // Create the InsertOrMerge table operation
        TableOperation insertOrMergeOperation = TableOperation.InsertOrMerge(entity);

        // Execute the operation.
        TableResult result = await table.ExecuteAsync(insertOrMergeOperation);
        IndividualEntity insertedCustomer = result.Result as IndividualEntity;

        // Get the request units consumed by the current operation.
        // RequestCharge of a TableResult is only applied to Azure Cosmos DB.
        if (result.RequestCharge.HasValue)
        {
            Console.WriteLine("Request Charge of InsertOrMerge Operation: " + result.RequestCharge);
        }

        return insertedCustomer;
    }
    catch (StorageException e)
    {
        Console.WriteLine(e.Message);
        throw;
    }
}

public static async Task<CloudTable> CreateTableAsync(string tableName)
{
    string storageConnectionString = "{Connection string copied from Azure Table API in previous section}";

    // Retrieve storage account information from the connection string.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnectionString);

    // Create a table client for interacting with the table service
    CloudTableClient tableClient = storageAccount.CreateCloudTableClient(new TableClientConfiguration());
    tableClient.TableClientConfiguration.UseRestExecutorForCosmosEndpoint = true;

    // Get a reference to the table and create it if it doesn't exist
    CloudTable table = tableClient.GetTableReference(tableName);
    if (await table.CreateIfNotExistsAsync())
    {
        Console.WriteLine("Created Table named: {0}", tableName);
    }
    else
    {
        Console.WriteLine("Table {0} already exists", tableName);
    }

    return table;
}

public class IndividualEntity : TableEntity
{
    public IndividualEntity()
    {
    }

    public IndividualEntity(string lastName, string firstName)
    {
        PartitionKey = lastName;
        RowKey = firstName;
    }

    public string Email { get; set; }
    public string PhoneNumber { get; set; }
}

  • Compile and run the project.
  • Visual Studio opens up a console (the local Functions runtime) as below. Copy the URL.
  • Hit the URL from either a browser or the Postman tool.
  • You would get the following output in the console.
  • Go to the Azure Portal and you should see a new entity created.

🙂

Categories: CRM

Power Apps – Component library

Components

In Power Apps, Components are reusable building blocks for canvas apps so that app makers can create custom controls to use inside an app.

  • A component’s scope is local, which means you can create a component in an app and reuse it within the screens of that particular app.
Adding ‘Component’ from ‘Canvas’ App.

Component Library

Component Libraries (currently in preview) are the recommended way to reuse components across apps.

Unlike a ‘Component’, whose scope is limited to the app, ‘Component Libraries’ can be reused across apps.

In this article, we are going to learn the following topics:

  • Create a new ‘Component Library’ with ‘Header’ and ‘Footer’ components.
  • Share the ‘Component Library’
  • Use the ‘Component Library’ in Canvas App
  • Modify the ‘Component Library’
  • Update ‘Component Library’ in Canvas App.

Let’s get started with the prerequisites.

Prerequisites:

  • PowerApps account. Refer here to get an account.

Once you fulfill the prerequisites, let’s get started with the first step.

Create a new ‘Component Library’ with ‘Header’ and ‘Footer’ components:

  • Connect to Power Apps maker portal
  • From the ‘Apps’ tab, click on ‘New component library’.
  • Provide the name and click on ‘Create’.
  • As we know, a ‘Component Library’ is a collection of reusable components, so in the next screen a default component is presented.
  • Rename the default Component to ‘Header’.
  • Add a Label control ‘lblHeaderText’, as shown in the screen.
  • Add a new ‘Custom property’ of type ‘Text’ and name it ‘Header Text’.
    • A component can receive input values and emit data if you create one or more custom properties.
  • Set the Label control’s ‘Text’ property to the ‘Header Text’ property.
    • The ‘Header Text’ property comes in handy when referencing the component in canvas apps.
  • Also set the ‘Header’ component’s ‘Width’ property to ‘App.ActiveScreen.Width’.
    • ‘App.ActiveScreen.Width’ sizes the component to the width of the hosting canvas app’s screen.
  • Let’s copy the ‘Header’ component using the ‘Duplicate component’ option to create the ‘Footer’ component.
  • I’ve added a Label and a new ‘Custom property’ of type ‘Text’ named ‘Footer Text’, as shown in the screen below.
  • Save the ‘Component Library’.
  • The ‘Component Library’ would then show up in the maker portal.

Share the ‘Component Library’:

Now that we have created the ‘Component Library’, we need to share it.

  • After saving, click on ‘Share’.
  • Add the people you would like to share it with. I’ve shared the Component Library with ‘Everyone’ in my account.

Use the ‘Component Library’ in Canvas App:

With the ‘Component Library’ created, let’s put it to use in a canvas app.

  • Create a new Canvas app.
  • To add the ‘Component Library’, click on ‘+ Insert’ and ‘Get more components’.
  • From the list, choose the components you need.
    • You would see the component libraries that were shared with you.
  • Added ‘Components’ would be available under ‘Library Components’.
  • Add both ‘Header’ and ‘Footer’ Components to the Canvas app’s screen.
  • Set the ‘HeaderText’ property of the ‘Header’ component. This immediately changes your header label text.
  • Similarly, set the ‘FooterText’ property of the ‘Footer’ component. This immediately changes your footer label text.

Modify the ‘Component Library’:

Let’s modify the ‘Component Library’ and see how it reflects in the canvas app.

  • Change the background color of the ‘Header’ component.
  • Save and Publish.
  • Go to the ‘Canvas App’ and you would notice a banner as below.
  • Click on ‘Review’ and click on ‘Update’ to get the latest Component updates.
  • The canvas app’s header changes as below.

Notes:

  • Sharing a component library works the same way you share a canvas app.
  • When you share a component library, you allow others to reuse the component library.
  • Once shared, others can edit the component library and import components from this shared component library for creating and editing apps.
  • If shared as a co-owner, a user can use, edit, and share a component library but not delete or change the owner.
  • You can’t add existing component libraries to a solution. However, you can create new component libraries for solutions using add component library flow.
  • You can’t access controls in the component from outside of the component.
  • You can’t refer to anything outside the component from inside the component. The exception is data sources shared between an app and its components.

🙂