[Code Snippet] PowerShell | Azure DevOps | Query Variable Group using API
In Azure DevOps, variable groups store values and secrets that you can pass into a YAML pipeline or make available across multiple pipelines.
A variable group is a collection of variables, each with a name and a value.
Below is my variable group ‘BillingSystemDictionary’ with 3 variables, in my DevOps organization ‘RajeevPentyala0011’ and project ‘Demo’ (these details are required in the PowerShell script).
In DevOps pipeline scenarios, you may need to fetch the variables under a variable group. Following is the PowerShell script, which calls the DevOps API and fetches variables by variable group name.
```powershell
# Form the API authentication header
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", "Bearer $(System.AccessToken)")
$headers.Add("Content-Type", "application/json")

# Pass the variable group name and read its variables
$variableGroupName = "BillingSystemDictionary"

# 'projectid' pattern is https://dev.azure.com/{organization_name}/{project_name}
$projectid = "https://dev.azure.com/RajeevPentyala0011/Demo"
$variableGroupGetUrl = "$projectid/_apis/distributedtask/variablegroups?groupName=$variableGroupName*&queryOrder=IdDescending&api-version=6.0-preview.2"

$queryResponseObject = Invoke-RestMethod -Uri $variableGroupGetUrl -Method GET -Headers $headers

# $queryResponseObject is of type PSCustomObject
# To fetch the value of the "isactive" variable
$variableName = "isactive"
$variableValue = $queryResponseObject.value.variables.$variableName.value
Write-Host "$variableName value is $variableValue"
```
🙂
Power Platform | Data source environment variables
Recently, I got a deployment (ALM) specific query from my SharePoint colleague who was exploring Power Platform’s Canvas Apps.
Query:
- The SharePoint team has 3 sites (i.e., Dev/QA/Prod), and they also use 3 different Power Platform environments for Development, QA, and Production.
- On Development environment, a Canvas App was built connecting to ‘Dev’ SharePoint site.
- When the ‘Canvas App’ moved to QA and Production environments using Solution import, it must connect to the ‘QA’ and ‘Prod’ SharePoint sites.
- Following diagram depicts the topology.
Solution:
- It’s a pretty common scenario, and one option would be to manually re-establish the connection references after importing the solution into the QA/Prod environments. This option is time consuming and not intuitive.
- The other, recommended option is “Data source environment variables”. We will see the ‘How’ part in the next section.
Data source environment variables:
- As we know, Environment variables store the parameter keys and values, which then serve as input to various other application objects.
- Environment variables are now much simpler to work with and can store parameters required to connect to data.
In this article, let's see how to create and consume data source environment variables in a Canvas App.
Configure Data source environment variables:
Create Environment variables:
In this article, I am using SharePoint site to connect from Canvas App.
- Create a SharePoint site if one doesn't exist already, and add a list using the native ‘Issue Tracker’ template (you can also use any list of your choice).
- From Power Apps maker portal, connect to Dev environment and create a new solution.
- Create a new ‘Canvas App’ from the Solution.
- Now, from the Canvas App, go to ‘File -> Settings’ and enable the setting Automatically create environment variables when adding data sources.
- Add a SharePoint data source and configure the Site and List.
- Once you add the ‘SharePoint’ data source, you would notice a banner message indicating that environment variables were auto created.
- Clicking ‘Manage them here’ redirects you to the solution, where you can access the auto-created ‘Environment Variables’.
- That’s all that is needed to create the environment variables.
Configure Environment variables during Solution Export and Import:
Exporting Solution:
- Before exporting the solution, select one of the Environment variables and choose ‘Edit’ and select ‘Remove from this solution’.
- This step is important: if the environment variables already have either a default value or a current value, you will not be prompted for new values during solution import unless you chose ‘Remove from this solution’.
- Repeat this step for other Environment variables.
- Export the solution.
Importing Solution:
- Select the target environment (i.e., QA in my case) and choose to import the solution.
- You will get the following ‘Connections’ pane, which allows you to configure the connection to the QA SharePoint site.
- You will be redirected to the Connections screen to establish the connection to the QA SharePoint site. Once you establish the connection, it looks as below.
- Go back to the solution import process and map the new connection configured in the above step.
- In the next step of the import, the system will ask you to provide values for the environment variables (i.e., Site and List).
- You would not get the above ‘Connections’ and ‘Environment Variables’ panes if you had not chosen the ‘Remove from this solution’ option.
This approach eliminates the manual step of re-establishing connections after solution import to the target environment.
Important Points:
- ‘Data source’ environment variables don’t require querying environment variables via the Dataverse connector, nor do they require premium licenses.
- The Environment variables can be unpacked and stored in source control using DevOps Power Platform tools.
- We can also use pre-existing data source environment variables by creating a new environment variable with a ‘Data Type’ of ‘Data source’.
- Use this option if you don’t want environment variables to be created automatically (i.e., if you prefer not to enable the setting Automatically create environment variables when adding data sources).
Refer to the docs for more details.
🙂
[Step by Step] Model-driven apps | In-app notifications
Using the in-app notification feature (Preview), we can notify users within model-driven apps. In this article, let's learn step by step how to enable the feature and use in-app notifications with different options.
Enable In-app notifications feature:
- Connect to Power Apps Maker Portal.
- Open the ‘Model Driven App’ using the ‘Edit in preview’ option.
- In the App Editor, select ‘Settings -> Upcoming’ and enable the In-app notifications (Preview) as below.
Create a Test User:
- To test in-app notifications, create a new user account. This user acts as the recipient of the notifications.
- Note: If you already have other users in your environment, skip this step and use one of the existing users as the recipient.
- I’ve created a user named ‘Test User 1’ to send the notifications to.
- Open the Model driven app in another browser using the Test User account.
- Next, copy the ‘Test User 1’ GUID by opening the user from ‘Advanced Find’. We will use this GUID while creating the notification.
As we have completed the prerequisites, it's time to create notifications.
Creating our first notification:
- The notification feature uses an OOB table called “appnotification”.
- We create a record in the ‘appnotification’ table, which becomes a notification for the targeted user.
- Each notification row is meant for a single user, identified by the Owner column value.
- If a notification needs to be sent to multiple users, a record needs to be added for each recipient.
- The sender controls the recipient through the Owner column.
Following is a code snippet using the ‘Client API’ to create a basic notification.
```javascript
function showBasicNotification() {
    // GUID of the user (i.e., Test User 1 in my case) to send the notification to.
    var systemuserid = "0758fd55-c2ca-ec11-a7b5-00224808dd66";
    var notificationRecord = {
        "title": "Welcome",
        "body": "Welcome to the world of app notifications!",
        "ownerid@odata.bind": "/systemusers(" + systemuserid + ")",
        "icontype": 100000000, // info
        "toasttype": 200000000 // timed
    };
    // Create the notification record
    Xrm.WebApi.createRecord("appnotification", notificationRecord).then(
        function success(result) {
            console.log("notification created with ID: " + result.id);
        },
        function (error) {
            console.log(error.message);
        }
    );
}
```
- Register the showBasicNotification() function on one of the form JavaScript events (i.e., On Load/On Save/Field OnChange).
- For this blog post, I’ve registered the script on the ‘On Load’ event.
- Use the browser ‘DevTools’ (F12) to check the successful execution of the function.
- Upon executing the script, the end user (i.e., Test User 1) gets a notification as below.
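Since each notification row targets a single user through the Owner column, sending the same notification to multiple users means creating one ‘appnotification’ record per recipient. Below is a minimal sketch of that pattern; the user GUIDs, title, and body are placeholders, not values from a real environment.

```javascript
// Builds one notification record per recipient (Owner column controls the recipient).
function buildNotificationRecords(userIds, title, body) {
    return userIds.map(function (userId) {
        return {
            "title": title,
            "body": body,
            "ownerid@odata.bind": "/systemusers(" + userId + ")",
            "icontype": 100000000, // info
            "toasttype": 200000000 // timed
        };
    });
}

// Creates the records via the Client API; call this from a form event handler.
function notifyUsers(userIds, title, body) {
    var records = buildNotificationRecords(userIds, title, body);
    return Promise.all(records.map(function (record) {
        return Xrm.WebApi.createRecord("appnotification", record);
    }));
}
```

Promise.all lets you await all the creates together, but each recipient still costs one Web API call.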
Notification with Action(s):
- Using the “data” tag, we can create notifications that include actions, custom body definitions, and custom icons.
- The following code snippet contains 2 actions (i.e., 2 URLs to navigate to) in the notification.
- ‘actions’ is an array.
```javascript
function showNotificationWithMultipleActions() {
    // Notification with multiple actions shown as a center dialog
    var systemuserid = "0758fd55-c2ca-ec11-a7b5-00224808dd66";
    var notificationRecord = {
        "title": "Different Search Options",
        "body": "This is to inform you that you can use following search options",
        "ownerid@odata.bind": "/systemusers(" + systemuserid + ")",
        "icontype": 100000000, // info
        "data": JSON.stringify({
            "actions": [
                {
                    "title": "Bing",
                    "data": {
                        "url": "https://bing.com",
                        "navigationTarget": "dialog"
                    }
                },
                {
                    "title": "Google",
                    "data": {
                        "url": "https://google.com",
                        "navigationTarget": "dialog"
                    }
                }
            ]
        })
    };
    Xrm.WebApi.createRecord("appnotification", notificationRecord).then(
        function success(result) {
            console.log("notification created with multiple actions: " + result.id);
        },
        function (error) {
            console.log(error.message);
        }
    );
}
```
- Upon execution, the end user gets a notification with 2 navigation actions as below.
Notification with a custom Title and Body:
- The data field supports overriding the plain Title and Body strings with a limited subset of Markdown.
- Below is the supported Markdown.
Text Style | Markdown |
---|---|
Bold | **Bold** |
Italic | _Italic_ |
Bullet list | - Item 1\r- Item 2\r- Item 3 |
Numbered list | 1. Green\r2. Orange\r3. Blue |
Hyperlinks | [Title](url) |
- Newlines can be included within the body using \n\n\n\n.
- Following is a code sample using Markdown to customize the Title and Body.
- Under the ‘actions’ tag, the url points to an existing record.
- Under ‘actions’ tag, url is pointing to an existing record.
```javascript
function notificationWithCustomTitleAndBody() {
    var systemuserid = "0758fd55-c2ca-ec11-a7b5-00224808dd66";
    var notificationRecord = {
        "title": "Example of Custom Title and Body",
        "body": "Maria Campbell mentioned you in a post.",
        "ownerid@odata.bind": "/systemusers(" + systemuserid + ")",
        "icontype": 100000004, // mention (displays @ symbol)
        "data": JSON.stringify({
            "title": "[Potential Prospect](?pagetype=entityrecord&etn=raj_prospect&id=48f5d750-cbca-ec11-a7b5-00224808dd66)",
            "body": "Here is a [Potential Prospect](?pagetype=entityrecord&etn=raj_prospect&id=48f5d750-cbca-ec11-a7b5-00224808dd66)",
            "actions": [
                {
                    "title": "View record",
                    "data": {
                        "url": "?pagetype=entityrecord&etn=raj_prospect&id=48f5d750-cbca-ec11-a7b5-00224808dd66"
                    }
                }
            ]
        })
    };
    Xrm.WebApi.createRecord("appnotification", notificationRecord).then(
        function success(result) {
            console.log("notification created with custom title and body: " + result.id);
        },
        function (error) {
            console.log(error.message);
        }
    );
}
```
- Upon execution, the end user receives the notification as below.
Notification with a custom Icon:
- Within the notification, set icontype to custom and, in the data field, include iconUrl with a value pointing to a web resource:
- "icontype": 100000005, // custom
- "data": "{ 'iconUrl': '/WebResources/cr245_AlertOn'}"
- The icon can be either an SVG or a PNG file.
- Following is the ‘notificationRecord’ object.
```javascript
var notificationRecord = {
    "title": "Welcome",
    "body": "Welcome to the world of app notifications!",
    "ownerid@odata.bind": "/systemusers(" + systemuserid + ")",
    "icontype": 100000005, // custom
    "data": "{ 'iconUrl': '/WebResources/cr245_AlertOn'}"
};
```
- The notification with a custom icon looks as below.
Key points on Notification usage:
- If you don’t specify “ownerid@odata.bind” in the ‘notificationRecord’ object, the notification goes to the user who created it.
- Notification polling:
- In-app notifications use polling to retrieve notifications periodically while the app is running.
- New notifications are retrieved at the start of the model-driven app and when a page navigation occurs, as long as the last retrieval was more than one minute ago.
- If a user stays on a page for a long duration, new notifications are not retrieved until the next page navigation.
- Notifications appear in the notification center until the recipient dismisses them or they expire. By default, a notification expires after 14 days, but your administrator can override this setting.
- You can change in-app notification behavior by setting Toast Type to one of the following values.
Toast Type | Behavior | Value |
---|---|---|
Timed | The notification appears for a brief duration (the default is four seconds) and then disappears. | 200000000 |
Hidden | The notification appears only in the notification center and not as a toast notification. | 200000001 |
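For example, a low-priority notification that should land only in the notification center can set toasttype to Hidden. A minimal sketch (the user GUID, title, and body are placeholders):

```javascript
// Builds a notification that appears only in the notification center (no toast).
function buildHiddenNotification(systemuserid) {
    return {
        "title": "Quiet update",
        "body": "This only appears in the notification center.",
        "ownerid@odata.bind": "/systemusers(" + systemuserid + ")",
        "icontype": 100000000, // info
        "toasttype": 200000001 // hidden
    };
}
```

Pass the returned record to Xrm.WebApi.createRecord("appnotification", …) exactly as in the earlier examples.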
- You can change the in-app notification icon by setting Icon Type to one of the following values.
Icon Type | Value |
---|---|
Info | 100000000 |
Success | 100000001 |
Failure | 100000002 |
Warning | 100000003 |
Mention | 100000004 |
Custom | 100000005 |
🙂
Dynamics 365 and Microsoft Power Platform | Release planner
As Dynamics Release Wave 1 2022 is being deployed this month, if you are planning an upgrade of your existing system, the Release planner tool (BETA) helps you organize and plan for new and upcoming features.
- To get started, first create an account. Alternatively, you can sign in with your work account.
- Pick the feature you want to try and add it to your plan by clicking ‘+ To my plan’.
- Click ‘Learn more’ to understand more about the feature.
- Once you add the features, go to ‘My Release Plans’ for the collated list.
- Refer to this article by the Microsoft FastTrack team on how to plan and approach the product release wave upgrades.
🙂
[Step by Step] Using JSLint extension in VSCode
In this article, let's see how to install and use the JSLint extension in VSCode.
What is JSLint:
- JSLint is a JavaScript program that looks for problems in JavaScript programs. It is a code quality tool.
- JSLint takes a JavaScript source and scans it. If it finds a problem, it returns a message describing the problem.
Install JSLint extension in VSCode:
- From the VSCode ‘Extensions’ pane, search for ‘JSLint’ and install it.
- Next, we need to install ‘jslint’ globally by running the ‘npm install -g jslint’ command from the Terminal.
- All the required files get installed, and the Terminal looks as below.
Run JSLint on a JavaScript File:
- Open the JavaScript file you want to validate in VSCode.
- Open the Terminal and run the ‘jslint {file_name}’ command. In my case, it's ‘jslint HelloWorld.js’.
- Script violations will be listed in the Terminal as below.
🙂
Dataverse | New Auditing Features
Audit data is now stored separately from customer records, so an organization's audit log can grow to many terabytes in size without limiting available database capacity.
Audit Retention Policy:
- In the ‘Admin Center’, on the ‘Environment’ page, we now have an ‘Auditing -> Manage’ option to set the audit retention policy.
- Select one of the options from the dropdown to set the retention duration.
- Audit records are automatically deleted after the retention period is over, relieving administrators of the burden of deleting audit data manually.
- It is also possible to keep the audit log indefinitely by not specifying a retention period.
Free up capacity:
- With the new audit deletion options, Administrators can delete the logs of one or more audited tables, delete the user access logs, or delete logs up to a specific date in order to free up storage space.
Refer to the docs link for more insights.
Dataverse | Plugins | ILMerge Alternative | Shared Project
Role of ILMerge in Plugin Development:
If you are familiar with writing Plugins in Dataverse, chances are that you would have used ILMerge to merge the Assemblies.
In a typical Dynamics plug-in development setup, we have the following .NET class library projects:
- Plugins.csproj
- Plugins.Helper.csproj
When you compile the above projects, you get two .dlls (i.e., Plugins.dll and Plugins.Helper.dll). As we can only deploy one .dll to Dataverse, we use ILMerge to merge Plugins.dll and Plugins.Helper.dll into one.
Is it recommended to use ILMerge in plugin development? The answer is no. As per this Microsoft article, ILMerge is not supported in plugin development.
Now what's the alternative? The answer is Shared Projects.
Steps to use Shared Projects in Plug-in Development:
To explain Shared Projects, I am going to build a simple plugin project ‘Plugins.csproj’ with a ‘PreAccountCreate’ class, which references a ‘Shared’ helper project ‘Plugins.Helper’.
- Create a new C# class library project and add ‘PreAccountCreate’ class as below.
- You can copy the code I’ve used from here.
- Now let's add a ‘Shared’ helper project, which our plugin project will reference.
- Right click your Solution and click ‘New Project’.
- Select a Project template of type C# ‘Shared Project’.
- Give a Name to your ‘Shared Project’.
- I’ve named it as ‘Plugins.Helper’.
- ‘Plugins.Helper’ Shared Project, looks as below in the Solution Explorer.
- Now add a Class file ‘AccountHelper.cs’ to the ‘Shared Project’.
- I’ve added a simple function ‘GetAccountName()’ which returns ‘Microsoft India’.
- To use the ‘Shared Project’ in our plug-in project, right click the ‘Plugins’ project and add a ‘Reference’.
- From ‘Shared Projects’, choose your project (i.e., Plugins.Helper in my case).
- Once you have referenced the ‘Shared Project’ in your plugin project, it looks as below.
- Now it's time to call ‘GetAccountName()’ from our ‘PreAccountCreate’ class.
- Sign the Assembly.
- Build the plug-in project, and you will get ‘Plugins.dll’ in the bin/Debug folder.
- Go ahead and deploy ‘Plugins.dll’ to Dataverse and register a step using the Plugin Registration Tool.
🙂
Dataverse | Modernize business units (Public Preview)
With this preview feature,
- Users are no longer restricted to accessing/managing data in their own business unit.
- Users can now access and own records across business units.
- Security roles from different business units can be assigned to the users regardless of the business unit the users belong to.
- This feature enables the Owning Business Unit column of the record so that it can be set/updated by the users.
- The Owning Business Unit column determines the business unit that owns the record.
- Business units’ data access can now support “Matrix data access” structure.
Modernize Business Unit – Feature Goals:
Modernize Business Unit – Design Elements:
New Organization Settings while Changing the User’s Business Units:
Refer to the following videos to learn more about this preview feature:
🙂
Dynamics CRM | Microsoft Power Apps | Legacy OData v2.0 Service removal date
OData v2.0 endpoint:
- The Organization Data Service (also known as the OData endpoint, or the REST endpoint when it was released) is an OData v2.0 endpoint introduced with Dynamics CRM 2011. It offers limited capabilities: it can only create, retrieve, update, or delete table data.
- The Organization Data Service was deprecated with Dynamics 365 Customer Engagement v8.0 in favor of the Web API, an OData v4.0 service.
- The Organization Data Service is planned for removal on November 11, 2022. Any code that uses it should be migrated to the Web API before that date.
- It's recommended to use the Microsoft Dataverse Web API.
Action Required:
- Use the Solution Checker to detect JavaScript web resource code that uses the endpoint. The rule web-avoid-crm2011-service-odata should detect use in client-side code.
- Check any other code, including PowerShell scripts, that sends requests to this endpoint: /xrmservices/2011/organizationdata.svc.
- Check any Power BI reports or Excel data sources that may be using this endpoint.
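To make the migration concrete, here is a small sketch contrasting a legacy Organization Data Service URL with its Web API equivalent. The organization URL, entity, and query are illustrative, not taken from a real environment.

```javascript
// Hypothetical organization URL - replace with your own.
var orgUrl = "https://contoso.crm.dynamics.com";

// Legacy Organization Data Service (OData v2.0) - scheduled for removal.
function legacyAccountsUrl(base) {
    return base + "/xrmservices/2011/organizationdata.svc/AccountSet?$select=Name&$top=3";
}

// Dataverse Web API (OData v4.0) equivalent of the same query.
function webApiAccountsUrl(base) {
    return base + "/api/data/v9.2/accounts?$select=name&$top=3";
}
```

Searching your code base for the string "/xrmservices/2011/organizationdata.svc" is a quick way to find calls that still need migrating.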
Refer to this link for more details.
🙂
Dataverse | New Text Formats | json, richtext
Text in Dataverse is a data type that can store a maximum of 4000 characters. Text has multiple formats that instruct the UI to treat it differently.
As an example, Email is a text format that tells the client to treat the contents of the field as an email address. It can display the data as a link that, when clicked, launches your default email client and inserts the address in the To: field.
Two new formats, json and richtext, have been introduced.
Json
- The json format stores text strings in JSON.
- This format does not perform any validation on the correctness of the JSON. It simply allows you to store, view, and retrieve content with JSON markup.
- It is currently limited to use on non-SQL tables (like Data Lake).
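Because the json format does not validate the stored text, your own code should serialize before writing and parse defensively after reading. A minimal sketch of that round trip (the payload shape is hypothetical):

```javascript
// The json format stores the text as-is, so serialize before writing...
var payload = { invoiceId: 1001, status: "paid" };
var stored = JSON.stringify(payload); // the string you would save into the column

// ...and parse defensively after reading, since correctness isn't enforced.
function tryParseJson(text) {
    try {
        return JSON.parse(text);
    } catch (e) {
        return null; // the column happily stores invalid JSON too
    }
}

var roundTripped = tryParseJson(stored);
```

Returning null (rather than throwing) keeps consuming code simple when a row happens to contain malformed JSON.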
Richtext
- The richtext format allows the use of markup tags to format your text when viewed in a compatible control.
- By setting this value, you can enable rich text for any text or multiline text column.
- A control is coming soon in the Canvas and Model-driven space that will be used whenever a column indicates it is richtext.
Please refer to this article for more details.
🙂