Tip #1449: Formula columns love calculated columns

Formula columns are great – just read Work with Dataverse formula columns. But there are some limitations in the Power Fx functions you can use, as well as in the return types.

  • Need a formula column returning a whole number? Make it a decimal with 0 precision.
  • Need to produce currency? Make a formula that returns a decimal, then convert it to currency using the good old calculated column, where you can define the return data type explicitly instead of having it inferred.
  • Need a choice? The only choice a formula column can return is yes or no. But the same trick applies: do your magic in a formula column, then create a calculated column and go through the painful if-then-else process ad nauseam to cover all possible return values. To be honest, I’d rather eat a frog and write a plugin.
Witch Mountain meme: two characters in a car having a technical discussion, which could be imagined as:

Character 1 (top panel, speaking seriously): “Our Dataverse telemetry indicates formula columns aren’t being leveraged for generating monetary returns.”

Character 2 (middle panel, with a questioning expression): “Maybe it’s because currency formula columns can’t be produced?”

Character 1 (bottom panel, looking puzzled or skeptical): [No text, just the character's reaction]

Personally, I think it’s all because, ironically, Power Fx in formula columns does not support the Power function. Source: trust me, dude.

Tip #1448: Never delete columns in translation

Another day, another guest Q&A post. (Have a tip of your own? Send it to jar@crmtipoftheday.com)

Question

I saw the Easy Translator plugin for XrmToolBox was updated, so I thought I would give it another try on our app, which is in 14 languages. Tanguy, can I delete rows and tabs that do not require translation? Or will it cause problems?

Steve “The Pot Stirrer” Mordue

Answer

Never delete columns. But you are free to delete rows and tabs.

Tanguy “The XRM Toolbox” Touzard

Tîpp Jäår $0.02 + GST

What Tanguy was referring to is the Excel file you get when you export customized table and column text for translation with Power Apps. Tabs contain the types of translations required, rows refer to individual strings requiring translation, and columns contain languages. So, yes, Steve would have to look at all 14 language columns.

Tip #1447: Use Parse JSON for settings

I often see makers declaring multiple settings in a Power Automate flow using a single variable per setting.

The screenshot shows a sequence of three configuration steps in a Power Automate flow, each labeled with the action "Initialize variable." The first step initializes an integer variable named "Last Month" with the value 6, indicating the last month of the financial year. The second step initializes a string variable named "Default Currency" with the value "AUD," setting the Australian Dollar as the default currency. The final step initializes a float variable named "Maximum Writeoff" with the value 499.99, which could represent a maximum allowable write-off amount in the given financial context. Each step is represented with a distinct color code, suggesting a logical progression in the flow.

For complex flows, it can quickly get out of hand, as variables must be declared at the top level; they cannot be nested or even scoped. An environment variable for each setting, you say? Way to make friends with the deployment admins!

Instead, use the Parse JSON action and keep your flows compact by combining all related settings in one neat JSON object.


The screenshot displays a dialog box from Power Automate for setting properties in a 'Parse JSON' action. The content section includes a JSON object with three properties: "LastMonth" with a value of 6, "DefaultCurrency" set to "AUD", and "MaxWriteoff" set at 999.99. Below the content box, the schema section is partially filled out with JSON schema definitions corresponding to the above properties, indicating "LastMonth" as an integer and "DefaultCurrency" as a string. There is also a button to "Generate from sample" to create the schema based on a sample JSON content.
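As a sketch, the content box holds a single object such as `{"LastMonth": 6, "DefaultCurrency": "AUD", "MaxWriteoff": 999.99}` (values illustrative), and the Generate from sample button produces a schema along these lines:

```json
{
    "type": "object",
    "properties": {
        "LastMonth": {
            "type": "integer"
        },
        "DefaultCurrency": {
            "type": "string"
        },
        "MaxWriteoff": {
            "type": "number"
        }
    }
}
```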

Accessing settings after that is intuitive and self-explanatory. And you get a Body as a bonus.

The image is a screenshot of a dynamic content selector within a Power Automate workflow. The panel shows a list of variables and settings from the workflow, indicating where they can be used as dynamic content in subsequent actions. Variables displayed include "Maximum Writeoff," "Default Currency," and "Last Month." Under the settings, items such as "Body," "LastMonth," "DefaultCurrency," and "MaxWriteoff" are available for selection. The interface elements such as search bar, information, and close icons, along with the expansion arrows for each section, suggest that the user can interact with this panel to insert these dynamic values into other parts of the workflow.

Tip #1445: Date picker in Power Pages revisited

The product has been renamed at least twice since we last talked about how to restrict the date picker control in what is now Power Pages. But most importantly, that code, according to some reports, no longer works. Fear not: here’s the revised version that works perfectly well in Power Pages (it did 5 minutes ago, anyway).

In this example we have two date controls. We want to restrict the first one to future dates only, and the second one to an interval of no more than two weeks from the first.

Add the following script to the page containing the form.

function getDatePicker(name) {
    // return the datepicker for the given column name
    return $('#' + name)
        .siblings('.datetimepicker')
        .data('DateTimePicker');
}

function initDates(from, to) {
    console.log('initDates', from, to);
    
    // default FROM is today
    var fromDate = from || moment().startOf('day');
    
    // default required is in 7 days
    var toDate = to || fromDate.clone().add(7, 'days');
    
    getDatePicker('enabler_datesubmitted').date(fromDate);
    getDatePicker('enabler_daterequired').date(toDate);
}

function dateonchange() {
    var pickerFrom = getDatePicker('enabler_datesubmitted');
    var pickerTo = getDatePicker('enabler_daterequired');
    var from = pickerFrom.date();

    // no past FROM date
    pickerFrom.minDate(moment().startOf('day'));
    
    // force TO date between the next day and 2 weeks
    if (from && from.isValid()) {
        var fromplusone = from.clone().add(1, 'days');
        var fromplus14 = from.clone().add(2, 'weeks');
        pickerTo
            .minDate(fromplusone)
            .maxDate(fromplus14);
    }
    else {
        // clear both restrictions when FROM is empty
        pickerTo.minDate(null);
        pickerTo.maxDate(null);
    }
}

$(document).ready(function () {
    $("#enabler_datesubmitted").next()
        .on("dp.change",
            function (e) { dateonchange(); });
    initDates();
});

And this CSS to fine-tune the accessibility:

.day.disabled > button {
    text-decoration: line-through;
    cursor: not-allowed;
    color: lightgrey;
}

The end result: the first control only allows selection from today onwards, and the second one is restricted from the day after the first date until two weeks after it.

The image shows two date picker calendars side by side. The left one is labeled "Date Submitted" with the date March 4, 2024, highlighted in red and pointed at with a red arrow, indicating selection. The right calendar is labeled "Date Required" with the date March 11, 2024, circled in red and also pointed at with a red arrow, indicating another selection. Both calendars display the month of March 2024, with days of the week labeled from Sunday to Saturday on the top. The "Date Required" calendar is slightly faded compared to the "Date Submitted" calendar, suggesting a sequential relationship or a restriction where the "Date Required" must be after the "Date Submitted".

Tip #1444: New Power Automate UI unearths hidden property

Today’s tip is from Matt Collins-Jones. (Have a tip of your own? Send it to jar@crmtipoftheday.com)

I (MCJ that is – t.j) was building a flow yesterday using the manual flow trigger and wanted to upload a document/file. I was in the classic UI and noticed I couldn’t get the name of the file via dynamic content. I thought this was a strange limitation, but I was only building something for demo purposes.

The image shows a screenshot of a Power Automate action configuration titled "Add attachment". It is part of a flow setup where fields for 'Site Address', 'List Name', 'Id', 'File Name', and 'File Content' are displayed with corresponding input fields. The 'File Name' and 'File Content' fields contain dynamic content represented by icons, labeled 'file.name' and 'FileUpload', respectively. There is a red arrow pointing to the 'File Name' field, suggesting attention is needed there. On the right side of the screenshot, there is a panel labeled "Dynamic Value" with a search bar, below which are listed other dynamic content options like 'Country/Region', 'City', 'State', 'Street', 'Postal code', and 'FileUpload'. At the bottom, the user has added multiple question marks, indicating confusion or the need for assistance regarding what dynamic content to select or enter for the 'File Name' and 'File Content' fields.

I came back to the flow after having saved it, and it opened up in the new UI. When I went to look at the dynamic content, I noticed the name of the file content was different, so I opened the dynamic content pane and found a property not present in the classic UI.

The image is a screenshot of a Power Automate modern action setup interface for "Add attachment" within a flow. The interface is displaying fields for 'Site Address', 'List Name', 'Id', 'File Name', and 'File Content'. The 'File Name' and 'File Content' fields are both filled with a dynamic content token labeled 'FileUpload'. There is a red arrow pointing from the 'Manually trigger a flow' sidebar on the right, highlighting the dynamic content 'FileUpload name' as the likely input for the 'File Name' field. The sidebar also shows 'FileUpload contentBytes', suggesting this as the input for the 'File Content' field.

The property appears in the JSON for the trigger, but not in the classic UI, so maybe the new UI is just smarter at reading the JSON schema to produce its list of dynamic content than the old UI. We may find more previously hidden properties in the new UI.

The image displays a JSON schema code snippet used to manually trigger a flow in Power Automate. The schema defines an object type with properties for a file, which is itself an object. The file object includes properties such as "title" with the value "File Content", and a description prompting to "Please select file or image". There are hints indicating the content type expected is a file, with further nested properties for "name" as a string and "contentBytes" as a string with a format specified as "byte". The "required" field at the bottom of the schema is an empty array, suggesting there are no required properties for this trigger.
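Reconstructed from the screenshot, the trigger schema would look roughly like this (the exact hint property names are an assumption; check your own trigger’s JSON):

```json
{
    "type": "object",
    "properties": {
        "file": {
            "title": "File Content",
            "type": "object",
            "x-ms-content-hint": "FILE",
            "description": "Please select file or image",
            "properties": {
                "name": {
                    "type": "string"
                },
                "contentBytes": {
                    "type": "string",
                    "format": "byte"
                }
            }
        }
    },
    "required": []
}
```

The nested name property is exactly what the classic UI never surfaced as dynamic content.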

Tîpp Jäår $0.02 + GST

The name/contentBytes duo looks familiar to anyone who has ever sent attachments in Power Automate. No longer having the property [accidentally?] hidden goes a long way toward creating a maker-friendly UI. A small plus for the new UI (the game is still on, though!).

Cover image courtesy of a conversation with AI. First it generated a dramatic image, then the description to go with it. The level of drama is unbearable.

The image illustrates a stark contrast between two versions of the Power Automate maker interface. On the left side, ancient Egyptian hieroglyphs are carved into a stone wall, bathed in a warm, golden glow emanating from a sun-like source, representing an ancient user interface. On the right side, a futuristic and advanced user interface is depicted with vibrant neon circuitry, transparent screens, and augmented reality elements, all glowing against a dark background, symbolizing cutting-edge technology. The two sides are dramatically juxtaposed to highlight the evolutionary leap in technology from past to future.


Tip #1443: Let SharePoint deal with rogue data

Using Power Automate to import data into an existing SharePoint list is fairly straightforward, except when the data isn’t very clean. For instance, if a number column contains blank values or some non-numerical garbage, the SharePoint connector will flip out.

OpenApiOperationParameterTypeConversionFailed

The ‘inputs.parameters’ of workflow operation ‘Create_item’ of type ‘OpenApiConnection’ is not valid. Error details: Input parameter ‘item/Interest’ is required to be of type ‘Number/double’. The runtime value ‘””‘ to be converted doesn’t have the expected format ‘Number/double’.

OpenApiOperationParameterValidationFailed

The ‘inputs.parameters’ of workflow operation ‘Create_item’ of type ‘OpenApiConnection’ is not valid. Error details: The API operation ‘PostItem’ requires the property ‘item/Interest’ to be greater than or equal to ‘-1.7976931348623157E+308’ but is ‘NaN’.

— SharePoint Connector

The first instinct might be to wrap your data in an if(empty()) construct or use clever error handling for invalid data. This approach works, unless you’re dealing with a large number of columns of similar “quality”. A workaround is to let SharePoint handle the rogue data: change the column type to a single line of text, import the data, then change the type back to a number. The type change cleans out all invalid entries, leaving only valid numbers.

Cover image courtesy of AI. Ignoring “Denitent Cemmnieagl Datter”, not bad.

Tip #1442: Security for beginners

We talk about security around Dataverse and Power Platform from time to time. We even dabble in platform-agnostic security tips. Today is all about vendor-agnostic cybersecurity.

Learn the fundamentals of identity management, zero trust, AppSec, and data security in this new 7-lesson open source course, “Security for Beginners” created by Microsoft Cloud Advocates. Each lesson should take around 30-60 minutes to complete and will help kick-start your security learning.

https://aka.ms/sec101-beginners

Tip #1441: Owner is a user is a team

WARNING: Niche L400 tip ahead.

tl;dr

If you expect the primary column name to be tablename + “id“, add an exception for the systemuser and team tables – both will come back as ownerid in a custom API.

L400

If a custom API has an output property of type EntityReference then, for example, for account (or any other “normal”) table, the following JSON is returned as expected:

"Account": {
    "@odata.type": "#Microsoft.Dynamics.CRM.account",
    "accountid": "87aa5cfb-197c-ea11-a813-000d3abeef18"
}

Save yourself a trip to metadata and use tablename + “id” for the primary column name, right? Not so fast. For either systemuser or team, you get ownerid instead of the expected systemuserid or teamid:

"User": {
    "@odata.type": "#Microsoft.Dynamics.CRM.systemuser",
    "ownerid": "89362cfb-22aa-eaea-abba-dead3a6a9942"
}

No one knows the dev docs better than Jim Daly, and his patience in expressing the RTFM sentiment is fascinating.

In Web API ownerid is the primary key for the systemuser EntityType and the team Entitytype because both inherit this property from the principal EntityType, which enables the capability for either type to be owners of records.

— Jim Daly

The ownerid value is the same as the respective systemuserid or teamid columns.

EDIT: the same applies to any activity table, e.g. task, email, appointment, etc, because any table that is configured as an activity inherits from the activitypointer entity type. The primary key for any activity will be activityid, according to Web API EntityTypes – Power Apps | Microsoft Learn.
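As a sketch, the naming rules above can be rolled into a small helper. This is illustrative JavaScript, not a Dataverse API, and the activity table list is a sample, not exhaustive:

```javascript
// Sample of activity tables; any table configured as an activity
// inherits from activitypointer, so its primary key is activityid.
const ACTIVITY_TABLES = new Set([
    "task", "email", "appointment", "phonecall", "letter", "fax"
]);

// Primary key column name as it appears in a custom API
// EntityReference response.
function primaryIdColumn(tableName) {
    // systemuser and team inherit ownerid from the principal entity type
    if (tableName === "systemuser" || tableName === "team") {
        return "ownerid";
    }
    if (ACTIVITY_TABLES.has(tableName)) {
        return "activityid";
    }
    // "normal" tables follow the tablename + "id" convention
    return tableName + "id";
}
```

For example, primaryIdColumn("account") returns "accountid", while primaryIdColumn("team") returns "ownerid".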

Tip #1440: Relative dates in sample data

We all enjoy flexing our Dataverse or Dynamics 365 skills from time to time. Many of us have meticulously crafted a demo instance from proverbial stone and blood. However, one of the challenges is keeping the data up to date. There’s nothing worse than displaying 20 tasks that are all overdue by four months. Fortunately, Dataverse offers a fantastic feature that allows us to modify data during import! Grab yourself a cuppa, a blanket, and a comfortable horizontal surface, and dive in: Add transformation mappings for import (Microsoft Dataverse) – Power Apps | Microsoft Learn.

Screenshot of an Excel spreadsheet. Selected cell has the following formula in it: NOW() + 14.

Alternatively, you can simply use an Excel file with formulas while following the instructions: Import or export data from Microsoft Dataverse – Power Apps | Microsoft Learn. Boing!

For fancier, more modern transformations you may want to look into Importing using a connector and Power Query.

#"Added Two Weeks" = Table.ReplaceValue(
    #"Changed column type",
    each [Due Date],
    each DateTime.From([Due Date]) + #duration(14, 0, 0, 0),
    Replacer.ReplaceValue,
    {"Due Date"})