Power Apps | Microsoft Dataverse

November 17, 2020 2 comments

Microsoft Dataverse

Common Data Service (CDS), the sophisticated and secure backbone that powers Dynamics 365 and Power Platform, has been renamed to Microsoft Dataverse.

Some terminology in Microsoft Dataverse has been updated. For example, ‘entity’ is now ‘table’ and ‘field’ is now ‘column’.

Microsoft Dataverse for Teams

Microsoft Dataverse for Teams (formerly known as Project Oakdale), a built-in low-code data platform for Teams, is now generally available.

Microsoft Dataverse for Teams follows existing data governance rules established by the Power Platform and enables access control in the Teams Admin Center like any other Teams feature. Within the Teams Admin center, you can allow or block apps created by users at the individual level, group level, or org level.

Refer to my article on Project Oakdale to learn more.

Power BI Teams App

Licensed Microsoft Power BI users can enjoy the full capabilities of Power BI in Teams with the Power BI Teams App.

Power Automate App for Teams

The Power Automate app for Teams lets you automate your Microsoft Teams activities and connect Microsoft Teams to other apps and services.

Click here to learn more about Dataverse.

🙂

Categories: CRM

Canvas Apps | Useful functions

November 15, 2020 Leave a comment

In this article, I am collating useful formulas that are frequently used in Canvas Apps.

For all functions, I am going to use a Text input control ‘txtInput’ as the reference.

Match

Validates user input based on predefined or custom patterns. For example, you can confirm whether the user has entered a valid email address, SSN, and so on.

Validate Email:
  • IsMatch(txtInput.Text,Email)
Validate SSN:
  • IsMatch( txtInput.Text, Digit & Digit & Digit & Hyphen & Digit & Digit & Hyphen & Digit & Digit & Digit & Digit )
Validate SSN using Regular Expression (RegEx)
  • IsMatch( txtInput.Text, "\d{3}-\d{2}-\d{4}" )
Check presence of string
  • IsMatch( txtInput.Text, "hello", Contains & IgnoreCase )
    • Check if ‘hello’ exists in the Text input.

User

Get Current User Email
  • User().Email
Get Current User Fullname
  • User().FullName
Get Profile Image
  • User().Image
    • Add an Image control and set ‘Image’ property with User().Image.

Date functions

Today()
  • Returns the current date as a date/time value.
  • The time portion is always midnight (12:00:00 AM).
Now()
  • Returns the current date and time as a date/time value
IsToday()
  • Checks whether a date/time value is between midnight today and midnight tomorrow. Returns a Boolean (true or false) value.
Weekday()
  • Weekday(Date)
    • Returns a number from 1 to 7, where Sunday is ‘1’.
DateDiff()
  • DateDiff(Date1.SelectedDate, Date2.SelectedDate, Days)
    • ‘Date1’ and ‘Date2’ are ‘Date Picker’ controls.

Miscellaneous

  • IsBlank(txtInput.Text)
    • Checks for a blank value or an empty string.
  • Coalesce
    • Evaluates its arguments in order and returns the first value that isn’t blank or an empty string.
    • Use this function to replace a blank value or empty string with a different value, but leave non-blank and non-empty string values unchanged (see the sketch after this list).
  • GUID
    • GUID() – Returns a new Guid.
      • Set( NewGUID, GUID() )
    • To generate 5 new GUIDs and store them in a collection:
      • ClearCollect( NewGUIDs, ForAll( Sequence(5), GUID() ) )
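As a quick illustration of IsBlank and Coalesce together, here is a minimal sketch (assuming a Label control named ‘lblGreeting’ whose Text property is set to either formula, alongside the ‘txtInput’ control used above):

// Replace a blank or empty input with a default value
Coalesce( txtInput.Text, "No input provided" )

// Equivalent check written explicitly with IsBlank
If( IsBlank( txtInput.Text ), "No input provided", txtInput.Text )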

🙂

Categories: CRM

[Code Snippet] Set Business process flow (BPF) stage using C#

October 15, 2020 Leave a comment

Assume you have a BPF with 3 Stages on an Entity ‘Employer’. When you create a new ‘Employer’ record, by default ‘Stage-1’ gets set.

What if you have to create the ‘Employer’ record with a different stage?

Let’s see how to create a record and set the desired BPF stage with an example entity ‘Employer’.

I have an ‘Employer’ entity and a BPF named ‘Employer flow’ with 3 stages: ‘Basic Details’, ‘Address’, and ‘Experience’.

A key point to note is that whenever you create a BPF, a new entity gets created with the given BPF name.

‘Employer flow’ is the BPF entity

Following are the steps to create an ‘Employer’ record and set the BPF stage to ‘Experience’.

  • Query the ‘Workflow’ entity to get the BPF ID by passing the BPF entity schema name (i.e., crf10_employerflow).
var queryEmployerBPF = new QueryExpression
                {
                    EntityName = "workflow",
                    ColumnSet = new ColumnSet(true),
                    Criteria = new FilterExpression
                    {
                        Conditions =
                        {
                            new ConditionExpression
                            {
                                AttributeName = "uniquename",
                                Operator = ConditionOperator.Equal,
                                Values = { "crf10_employerflow" }
                            }
                        }
                    }
                };
                var retrievedBPF = ConnectionManager.CrmService.RetrieveMultiple(queryEmployerBPF).Entities[0];
                var _bpfId = retrievedBPF.Id;
  • Query ‘Process Stage’ records by passing the BPF ID fetched in the previous step.
var queryPS = new QueryExpression
{
    EntityName = "processstage",
    ColumnSet = new ColumnSet(true),
    Criteria = new FilterExpression
    {
        Conditions =
        {
            new ConditionExpression
            {
                AttributeName = "processid",
                Operator = ConditionOperator.Equal,
                Values = { _bpfId }
            }
        }
    }
};
  • Copy the stage GUIDs, which will be used in the next steps.
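The loop below executes the query and prints each stage name and ID; it is the same loop that appears in the complete snippet at the end of this post.

                var retrievedPS = ConnectionManager.CrmService.RetrieveMultiple(queryPS);

                // Print each stage name and ID; copy the GUIDs you need
                foreach (var stage in retrievedPS.Entities)
                {
                    Console.WriteLine($"Stage Name : {stage["stagename"]}");
                    Console.WriteLine($"Stage ID : {stage["processstageid"]}");
                }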
  • Create the ‘Employer’ record, which also creates a record in the BPF entity (i.e., Employer Flow).
            Entity entEmployer = new Entity("crf10_employer");
            entEmployer["crf10_name"] = "BPF Test";
            //entEmployer["processid"] = Guid.Empty;
            var violationID = ConnectionManager.CrmService.Create(entEmployer);
  • Fetch the BPF entity (i.e., Employer Flow) record, which was auto-created in the previous step, using the ‘RetrieveProcessInstancesRequest’ request.
                var procOpp2Req = new RetrieveProcessInstancesRequest
                {
                    EntityId = violationID,
                    EntityLogicalName = "crf10_employer"
                };

                var procOpp2Resp = (RetrieveProcessInstancesResponse)ConnectionManager.CrmService.Execute(procOpp2Req);
  • Update the ‘activestageid’ field of the BPF entity (i.e., Employer Flow) record fetched in the previous step with the desired stage GUID captured in step 2.
// Declare variables to store values returned in response
                int processCount = procOpp2Resp.Processes.Entities.Count;
                var activeProcessInstance = procOpp2Resp.Processes.Entities[0]; // First record is the active process instance
                var _processOpp2Id = activeProcessInstance.Id; // Id of the active process instance, which will be used

                // Retrieve the process instance record to update its active stage
                ColumnSet cols1 = new ColumnSet();
                cols1.AddColumn("activestageid");
                Entity retrievedProcessInstance = ConnectionManager.CrmService.Retrieve("crf10_employerflow", _processOpp2Id, cols1);

                // Update the stage to 'Experience' by passing GUID (i.e.,"05aeaf03-e135-40ac-8ae7-cafc7d746a02") 
                retrievedProcessInstance["activestageid"] = new EntityReference("processstage", new Guid("05aeaf03-e135-40ac-8ae7-cafc7d746a02"));
                ConnectionManager.CrmService.Update(retrievedProcessInstance);
  • Open the record from the app and the stage should be set to ‘Experience’.
‘BPF Test’ record’s stage set to ‘Experience’
  • Check the BPF records and you should see that ‘Active Stage’ is set to ‘Experience’ (this step is optional and is for your reference).
  • Below is the complete snippet.
                var queryEmployerBPF = new QueryExpression
                {
                    EntityName = "workflow",
                    ColumnSet = new ColumnSet(true),
                    Criteria = new FilterExpression
                    {
                        Conditions =
                        {
                            new ConditionExpression
                            {
                                AttributeName = "uniquename",
                                Operator = ConditionOperator.Equal,
                                Values = { "crf10_employerflow" }
                            }
                        }
                    }
                };
                var retrievedBPF = ConnectionManager.CrmService.RetrieveMultiple(queryEmployerBPF).Entities[0];
                var _bpfId = retrievedBPF.Id;

                var queryPS = new QueryExpression
                {
                    EntityName = "processstage",
                    ColumnSet = new ColumnSet(true),
                    Criteria = new FilterExpression
                    {
                        Conditions =
                        {
                            new ConditionExpression
                            {
                                AttributeName = "processid",
                                Operator = ConditionOperator.Equal,
                                Values={ _bpfId }
                            }
                        }
                    }
                };
                var retrievedPS = ConnectionManager.CrmService.RetrieveMultiple(queryPS);
                // Copy the Stage GUIDs using the below loop.
                foreach (var stage in retrievedPS.Entities)
                {
                    Console.WriteLine($"Stage Name : {stage["stagename"]}");
                    Console.WriteLine($"Stage ID : {stage["processstageid"]}");
                }
//Create 'Employer' record
                var entEmployer = new Entity("crf10_employer");
                entEmployer["crf10_name"] = "BPF Test";
                //entEmployer["processid"] = Guid.Empty;
                var violationID = ConnectionManager.CrmService.Create(entEmployer);

                var procOpp2Req = new RetrieveProcessInstancesRequest
                {
                    EntityId = violationID,
                    EntityLogicalName = "crf10_employer"
                };

                var procOpp2Resp = (RetrieveProcessInstancesResponse)ConnectionManager.CrmService.Execute(procOpp2Req);

                // Declare variables to store values returned in response
                int processCount = procOpp2Resp.Processes.Entities.Count;
                var activeProcessInstance = procOpp2Resp.Processes.Entities[0]; // First record is the active process instance
                var _processOpp2Id = activeProcessInstance.Id; // Id of the active process instance, which will be used

                // Retrieve the process instance record to update its active stage
                var cols1 = new ColumnSet();
                cols1.AddColumn("activestageid");
                var retrievedProcessInstance = ConnectionManager.CrmService.Retrieve("crf10_employerflow", _processOpp2Id, cols1);

                // Update the stage to 'Experience' by passing GUID (i.e.,"05aeaf03-e135-40ac-8ae7-cafc7d746a02") 
                retrievedProcessInstance["activestageid"] = new EntityReference("processstage", new Guid("05aeaf03-e135-40ac-8ae7-cafc7d746a02"));
                ConnectionManager.CrmService.Update(retrievedProcessInstance);

🙂

Categories: CRM

Power Apps | Unable to query Canvas App using CDS WebAPI

September 27, 2020 Leave a comment

In my Power Apps environment, I have the following ‘Hello World’ Canvas App.

When I queried the ‘Canvas App’ entity using the Web API from the same environment, no Canvas App records were returned.

Reason:

  • If you create a Canvas App from the ‘Apps’ tab outside of solutions, or import an app into the environment, the Canvas App will not be available in CDS until it is added to a solution.
  • Open any unmanaged solution in the environment, use ‘Add Existing’, and select the ‘Hello World’ Canvas App from the ‘Outside solutions’ tab.
  • Once added, a ‘Name’ gets populated and looks as below.
  • Now query the ‘Canvas App’ entity using the Web API, and the ‘Hello World’ Canvas App record should be returned (see the example query below).
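A sketch of the Web API call, assuming the standard ‘canvasapps’ entity set name (replace the org URL and API version with your environment’s):

GET https://<yourorg>.crm.dynamics.com/api/data/v9.1/canvasapps?$select=name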

🙂

Categories: PowerApps

Power Platform Admin Analytics

September 27, 2020 Leave a comment

In previous Dynamics versions, with the help of Organization Insights (available as a preferred solution from AppSource), we could get organization-level insights such as the number of active users, API calls, etc.

Now these analytics can be viewed right from the Power Platform Admin Center, with no need for additional solutions.

Currently, there are 3 types of analytics available.

Who can view the Analytics reports:

Admins with the following roles and a license can view the analytics:

  • Environment Admin – can view reports for the environments that the admin has access to.
  • Power Platform admin – can view reports for all environments.
  • Dynamics 365 admin – can view reports for all environments.
  • Microsoft 365 Global admin – can view reports for all environments.

Refer to this article for more details.

🙂

Categories: CRM, PowerApps

Microsoft Teams and PowerApps – Project Oakdale

September 24, 2020 1 comment

Project Oakdale, a new built-in low-code data platform for Teams that provides enterprise relational datastores with rich data types to Teams users, is now in public preview.

Solutions built with Power Platform can be easily published to the Teams app store and can be used off the shelf or customized for specific needs.

Let’s see how to embed a Power App in Teams.

  • Connect to the Office 365 portal and check whether you have ‘Teams’ enabled.
  • If Teams is not available, go to the Teams site and sign in with your Office 365 credentials.
  • Go to Teams, click Apps -> Power Apps, and click ‘Add’.
  • A ‘Power Apps’ tab gets added to Teams.
  • From ‘Power Apps’, either create a new app by clicking ‘Create an app’ or pick any of the readily available apps.
  • Once you choose an existing app, select a Team channel where the app needs to be available. I chose the ‘Hello World’ team’s ‘General’ channel.
  • The app takes a few minutes to complete the setup.
  • Grant the required permissions.
  • The app loads as below and is ready to be used.

Notes:

  • Project Oakdale environments are automatically created for the selected team when you create a Power app in Teams for the first time or install a Power Apps app from the app catalog. See About the Project Oakdale environment.

🙂

Categories: PowerApps

[Code Snippet] Using Microsoft.Pfe.Xrm library

September 20, 2020 Leave a comment

The Pfe.Xrm library allows us to submit a bunch of XRM requests as batches (using ExecuteMultipleRequest) and execute them in parallel.

In this article, let’s see how to use the Microsoft.Pfe.Xrm library with a simple retrieve request.

What is Microsoft.Pfe.Xrm library?

  • Contains a set of common components for building solutions using the Dynamics 365 SDK, developed by Microsoft Premier Field Engineering (PFE) and core engineering teams.

Download Microsoft.Pfe.Xrm

  • Use ‘Tools -> NuGet Package Manager’ in Visual Studio to download the NuGet package.

Building blocks of Microsoft.Pfe.Xrm library

‘Microsoft.Pfe.Xrm’ allows us to trigger any ‘OrganizationRequest’ in batches (i.e., ExecuteMultipleRequest) in parallel. Let’s understand the building blocks of the Pfe library.

  • List<OrganizationRequest>
    • As a first step, prepare your ‘RetrieveMultipleRequest’ objects and add them to a List.
  • AsBatches() method
    • Call the Pfe library’s AsBatches() method, passing the List prepared in the previous step.
    • ‘AsBatches()’ returns an IDictionary<string, ExecuteMultipleRequest> object.
  • OrganizationServiceManager
    • Use the ‘Microsoft.Pfe.Xrm’ OrganizationServiceManager constructor to create the connection object.
    • There are multiple ‘OrganizationServiceManager’ constructor overloads.
    • In this article, I am using the overload that takes the organization service URI, user name, and password (a sketch follows after this list).
  • ParallelProxy.Execute
    • This is the final method; it accepts the ‘IDictionary<string, ExecuteMultipleRequest>’ object returned by the AsBatches() method discussed above.
    • ParallelProxy.Execute is called on the OrganizationServiceManager created in the previous step.

IDictionary<string, ExecuteMultipleResponse> batchResponses = pfeXrmConnection.ParallelProxy.Execute(batchesOrgRequest);
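For reference, a minimal sketch of the constructor overload mentioned above, matching the connection property in the complete snippet below (OrganizationUri, UserName, and Password are assumed to come from configuration):

var pfeXrmConnection = new OrganizationServiceManager(
    XrmServiceUriFactory.CreateOrganizationServiceUri(OrganizationUri),
    UserName,
    Password);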

Code Snippet

Now let’s put all the building blocks together; it becomes the following code snippet.

    public void RetrieveUsingPFE()
    {
        // Retrieve 'Account' and 'Contact'. You can prepare your list of entities.
        var listEntities = new List<string> { "account", "contact" };
        var batchRequests = new List<OrganizationRequest>();            

        int pageNumber = 1;
        string pagingCookie = null;

        // Filter for Active records. You can prepare your required filters as below.
        var filterActive = new FilterExpression(LogicalOperator.And);
        var condActive = new ConditionExpression("statecode", ConditionOperator.Equal, 0);
        filterActive.AddCondition(condActive);

        try
        {
            foreach (var entity in listEntities)
            {
                // Prepare 'Query Expression'
                var query = new QueryExpression(entity)
                {
                    ColumnSet = new ColumnSet(true),
                    PageInfo = new PagingInfo()
                };
                query.PageInfo.Count = 5000;
                query.PageInfo.PageNumber = pageNumber;
                query.PageInfo.PagingCookie = pagingCookie;

                query.Criteria.AddFilter(filterActive);

                var request = new RetrieveMultipleRequest
                {
                    Query = query
                };

                batchRequests.Add(request);
            }

            do
            {
                // Trigger Pfe.Execute
                var batchResponses = TriggerRequests(batchRequests);
                batchRequests = new List<OrganizationRequest>();
                pageNumber++;
                foreach (var responseItem in batchResponses)
                {
                    if (responseItem.EntityCollection.Entities.Count > 0)
                    {                            
                        var currEntityName = responseItem.EntityCollection.Entities[0].LogicalName;

                        foreach (var record in responseItem.EntityCollection.Entities)
                        {
                            // 'record' is the retrieved record
                            // Perform your desired operations
                        }
                    }

                    if (responseItem.EntityCollection.MoreRecords)
                    {
                        var query = new QueryExpression(responseItem.EntityCollection.EntityName)
                        {
                            ColumnSet = new ColumnSet(true)
                        };

                        query.Criteria.AddFilter(filterActive);

                        var request = new RetrieveMultipleRequest();
                        query.PageInfo = new PagingInfo
                        {
                            Count = 5000,
                            PageNumber = pageNumber,
                            PagingCookie = responseItem.EntityCollection.PagingCookie
                        };

                        request.Query = query;
                        batchRequests.Add(request);
                    }
                }
            } while (batchRequests.Count > 0);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error occurred, error message is {0}", ex.Message);
            throw; // rethrow, preserving the original stack trace
        }            
    }

    public List<RetrieveMultipleResponse> TriggerRequests(List<OrganizationRequest> batchRequests)
    {
        // 'recordsPerRequest' is the batch size (requests per ExecuteMultipleRequest), defined elsewhere
        var retrieveRequestBatches = batchRequests.AsBatches(recordsPerRequest);
        var batchResponses = ExecuteParallelProxy(retrieveRequestBatches);
        var listResponses = new List<RetrieveMultipleResponse>();

        foreach (var key in batchResponses.Keys)
        {
            foreach (var result in batchResponses[key].Responses)
            {
                if (retrieveRequestBatches[key].Requests[result.RequestIndex] is RetrieveMultipleRequest)
                {
                    var originalRequest = (RetrieveMultipleRequest)retrieveRequestBatches[key].Requests[result.RequestIndex];
                    // Capture failed records
                    if (result.Fault != null)
                    {
                        Console.WriteLine($" Exception : {result.Fault.Message}");
                    }
                    else if (result.Response != null && result.Response is RetrieveMultipleResponse) // Capture success records
                    {
                        var responseItem = (RetrieveMultipleResponse)result.Response;
                        listResponses.Add(responseItem);
                    }
                }                    
            }
        }

        return listResponses;
    }

    private IDictionary<string, ExecuteMultipleResponse> ExecuteParallelProxy(IDictionary<string, ExecuteMultipleRequest> retrieveRequestBatches)
    {            
        IDictionary<string, ExecuteMultipleResponse> batchResponses = null;
        try
        {
            batchResponses = CRMPfeConnection.ParallelProxy.Execute<ExecuteMultipleRequest, ExecuteMultipleResponse>(retrieveRequestBatches);                
        }
        catch (AggregateException ae)
        {
            foreach (var ex in ae.InnerExceptions)
            {
                Console.WriteLine($"Aggregate Error in ExecuteParallelProxy : {ex.Message.ToString()}");
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Error in ExecuteParallelProxy : {ex.Message.ToString()}");
        }

        return batchResponses;
    }

    private static OrganizationServiceManager crmPfeConnection; // backing field for the cached connection

    public static OrganizationServiceManager CRMPfeConnection
    {
        get
        {
            if (crmPfeConnection == null)
            {
                var UserName = ConfigurationManager.AppSettings["UserNameSrc"].ToString();
                var Password = ConfigurationManager.AppSettings["PasswordSrc"].ToString();
                var OrganizationUri = ConfigurationManager.AppSettings["OrganizationUriSrc"].ToString();
                ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
                crmPfeConnection = new OrganizationServiceManager(XrmServiceUriFactory.CreateOrganizationServiceUri(OrganizationUri), UserName, Password);
            }
            return crmPfeConnection;
        }
    }
  • Similar to the ‘RetrieveMultipleRequest’ used here, you can batch and trigger all other kinds of requests.

🙂

Power Apps | Custom connector | User does not have an entitlement to use PowerApps

September 2, 2020 Leave a comment

From the Dynamics trial instance, while registering a custom connector, I encountered the following issue.

Reason:

  • The error message clearly states that the user who is trying to register the custom connector does not have a ‘Power Apps’ license.
  • By default, when you subscribe for a trial, Dynamics only grants the ‘Dynamics 365 Customer Engagement Applications’ license, which does not include the ‘Power Apps’ option.
  • To check the user license, connect to the Microsoft 365 Admin Center.
  • Go to the user’s ‘Manage product licenses’ tab.

Fix:

  • Grant a Power Apps license to the user.
  • One option is to subscribe to the ‘Dynamics 365 Customer Engagement Plan’ license by following the steps below.
  • Connect to Microsoft 365 Admin Center
  • Go to ‘Billing -> Purchasing Services’.
  • Select ‘Dynamics 365 Customer Engagement Plan Trial’ and click on ‘Get free trial’.
  • You would get the following confirmation page upon procurement.
  • Now open the User, and grant the ‘Dynamics 365 Customer Engagement Plan’ license.
    • Make sure ‘PowerApps for Dynamics 365’ is enabled.
  • You should be able to configure Custom Connector now.
  • Refer Power Apps license guide for more details.

🙂

Categories: PowerApps

SQL SERVER | Merge Statement | The insert column cannot contain multi-part identifiers

The other day, while executing a SQL ‘MERGE’ statement on 2 tables, I ran into the following error.

The insert column list used in the MERGE statement cannot contain multi-part identifiers

Before we understand the reason and fix for this error, let’s cover the basics of the ‘MERGE’ statement.

‘MERGE’ statement:

  • The ‘MERGE’ statement can be used when you have 2 SQL tables and need to synchronize them by inserting, updating, or deleting rows in one table based on differences found in the other table.
  • ‘MERGE’ runs insert, update, or delete operations on a target table from the results of a join with a source table.

Step by Step using ‘MERGE’:

Let’s understand how MERGE works with an example.

  • Create 2 tables, ‘Cust_Source’ and ‘Cust_Trg’. Column ‘CID’ acts as the joining column between the 2 tables.
‘Source’ and ‘Target’ tables
  • Let’s use MERGE to sync ‘Cust_Trg’ with the ‘Cust_Source’ table, covering the following points:
    • Update the matching records in ‘Cust_Trg’ using the ‘Cust_Source’ rows.
    • Insert the records missing from ‘Cust_Trg’ using ‘Cust_Source’.
    • Delete the extra records from ‘Cust_Trg’ that are not present in ‘Cust_Source’.
Query:

MERGE [dbo].[Cust_Trg] AS Trg
USING [dbo].[Cust_Source] AS Src
ON Trg.CID = Src.CID
WHEN MATCHED -- Update the Target record fields from Source values.
THEN UPDATE SET Trg.Salary = Src.Salary, Trg.WorkExp = Src.WorkExp
WHEN NOT MATCHED BY TARGET -- INSERT the record in Target as it's not part of Target yet.
THEN INSERT (CID, [Name], Salary, WorkExp)
VALUES (CID, [Name], Salary, WorkExp)
WHEN NOT MATCHED BY SOURCE -- Delete the record from Target as it's not part of Source.
THEN DELETE

OUTPUT $action, -- INSERT/UPDATE/DELETE
DELETED.CID AS TargetCID,
DELETED.Name AS TargetName,
DELETED.Salary AS TargetSalary,
DELETED.WorkExp AS TargetWorkExp,
INSERTED.CID AS SourceCID,
INSERTED.Name AS SourceName,
INSERTED.Salary AS SourceSalary,
INSERTED.WorkExp AS SourceWorkExp;

SELECT @@ROWCOUNT 'No of Affected rows'

  • Execute the query and you would get results as below.

Issue encountered with MERGE:

  • When I ran my initial MERGE query, I ran into the following exception.
Don’t use Alias in ‘INSERT’
  • The issue was due to the usage of the alias in the INSERT column list (i.e., Trg.Salary, etc.).
  • Removing the alias and rerunning the query solves the issue.
Notes:
  • MERGE statement works best when the two tables have a complex mixture of matching characteristics. For example, inserting a row if it doesn’t exist, or updating a row if it matches.
  • When simply updating one table based on the rows of another table, you can improve performance and scalability with basic INSERT, UPDATE, and DELETE statements (a sketch follows below).
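For instance, when you only need to update ‘Cust_Trg’ from ‘Cust_Source’, a plain UPDATE with a join is enough (a sketch using the tables above):

-- Update matching Target rows from Source without MERGE
UPDATE Trg
SET Trg.Salary  = Src.Salary,
    Trg.WorkExp = Src.WorkExp
FROM [dbo].[Cust_Trg] AS Trg
INNER JOIN [dbo].[Cust_Source] AS Src
    ON Trg.CID = Src.CID;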

🙂

Categories: SQL

Working with JSON data in SQL Server

July 26, 2020 1 comment

In this article, let’s see how to work with JSON data in SQL Server using SSMS. The following features will be covered:

  • Format SQL table data in JSON format (Using FOR JSON PATH clause)
  • Store JSON data in SQL table
  • Query JSON data in SQL server (Using JSON_VALUE and JSON_QUERY functions)
  • Update JSON data in SQL table (Using JSON_MODIFY function)
  • Convert JSON data in to SQL table format (Using OPENJSON function)
  • Validate JSON data format (Using ISJSON function)

Format SQL table data in JSON format:

Let’s see how to format tabular SQL query results as JSON using the FOR JSON clause.

  • Create a simple ‘Employee’ table in SQL Server.
  • When you query the ‘Employee’ table, you get the result in tabular format.
  • Let’s convert the result set into JSON format using the FOR JSON PATH clause.
-- JSON PATH
select *
from Employee
FOR JSON PATH
  • Use WITHOUT_ARRAY_WRAPPER to get a single JSON object instead of an array. Use this option if the result of the query is a single object.
-- JSON PATH and WITHOUT_ARRAY_WRAPPER
select *
from Employee
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER
  • We can also form nested JSON objects. Let’s see how to show ‘Address’ as a separate object.
-- JSON PATH, WITHOUT_ARRAY_WRAPPER and 'Address' as separate object
select
ID,Name,
Street1 as [Address.Street1], Street2 as [Address.Street2],City as [Address.City], ZipCode as [Address.ZipCode]
from Employee
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER

Store JSON data in SQL table:

  • JSON is a textual format that can be used like any other string type in a SQL database. JSON data can be stored in a standard NVARCHAR column.
  • In the example below, the SQL table has a ‘JSONData’ column of type NVARCHAR(MAX).
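A minimal sketch of such a table (the ‘Products’ table queried in the following sections; the ID column is an assumption):

-- JSON documents are stored as plain NVARCHAR(MAX) text
CREATE TABLE Products
(
    ID INT IDENTITY(1,1) PRIMARY KEY,
    JSONData NVARCHAR(MAX)
);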

Query JSON data in SQL server:

  • Let’s query the JSON data using JSON_VALUE and JSON_QUERY.
  • JSON_VALUE:
    • The JSON_VALUE function extracts a scalar value from the JSON text stored in a SQL column.
    • The extracted value can be used in any part of the SQL query.
  • JSON_QUERY:
    • The JSON_QUERY function extracts a complex sub-object, such as an array or object, from the JSON text.
    • For example, the ‘Tags’ array in the SQL table above.
Query records whose Color is ‘Yellow’
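A sketch of that query, assuming the ‘Products’ table above and the JSON properties used later in this article (Color, Price, Tags):

-- JSON_VALUE extracts scalar values; JSON_QUERY returns the 'Tags' array as-is
SELECT
    JSON_VALUE(JSONData, '$.Color') AS Color,
    JSON_VALUE(JSONData, '$.Price') AS Price,
    JSON_QUERY(JSONData, '$.Tags')  AS Tags
FROM Products
WHERE JSON_VALUE(JSONData, '$.Color') = 'Yellow';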

Update JSON data in SQL table:

  • The JSON_MODIFY function can be used to update JSON text without re-parsing the entire structure.
  • Below is the query to update the ‘Price’ of all ‘Yellow’ color products to 1000.
Update Price for records where Color is ‘Yellow’
Update Products
set JSONData = JSON_MODIFY(JSONData,'$.Price',1000)
where
JSON_VALUE(JSONData,'$.Color') = 'Yellow'
  • After the query executes, ‘Price’ is updated to 1000 as shown below.

Convert JSON data in to SQL table format

  • Using the OPENJSON function, JSON data can be transformed into SQL table format.
  • OPENJSON
    • It’s a table-valued function that parses JSON text, locates an array of JSON objects, iterates through the elements of the array, and returns one row in the output for each element of the array.
Convert JSON data to Table format
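A sketch of OPENJSON with an explicit schema; the sample JSON values and column types here are assumptions:

-- One output row per element of the JSON array
DECLARE @json NVARCHAR(MAX) = N'[
    { "Color": "Yellow", "Price": 1000 },
    { "Color": "Red",    "Price": 2000 }
]';

SELECT *
FROM OPENJSON(@json)
WITH (
    Color VARCHAR(50) '$.Color',
    Price INT         '$.Price'
);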

Validate JSON data format

  • Since JSON is stored as standard text, there is no guarantee that the values stored in text columns are properly formatted JSON; you can verify this using the ISJSON function.
  • The ISJSON function returns 1 if the data is properly formatted JSON.
ISJSON(SQL_Cell) > 0
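For example, to return only the rows whose ‘JSONData’ column holds valid JSON (a sketch against the ‘Products’ table above):

SELECT *
FROM Products
WHERE ISJSON(JSONData) > 0;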

🙂

Categories: SQL