Tip #1405: Unable to assign license to a guest user

B2B collaboration is great. For example, you can share a canvas app with guest users invited with Azure Active Directory B2B collaboration. One of the steps in the process is adding B2B collaboration users in the Azure portal. But when you try adding a license assignment you may be faced with the unexpected “License assignments failed”:

Azure "Update license assignments" screen with the user attempting to assign Power BI (free) license to the guest user. Error message "License assignment failed" is displayed.

Why would assignment of a free license fail? The reason is revealed when you click the “License assignments failed” link:

Screenshot with the additional error details displayed. Words "License cannot be assigned to a user without a usage location specified" are highlighted

License cannot be assigned to a user without a usage location specified

Albert Einstein, 1948

That’s right, just edit the user profile and tell Azure where the user is and you’ll be able to assign the license, free or otherwise.

Cover photo by Elijah Macleod on Unsplash

Tip #1404: App secrets that last longer than 2 years

Using service principals is a great way to ensure that Dataverse connections in your Power Automate flows are not user-dependent and that nothing is going to break when the original creator’s account gets disabled. These are good step-by-step instructions: How to setup a Common Data Service Application User (and why). (I’ll give Nick Doelman a separate run through the gauntlet later on for granting permissions nobody needs.)

One of the steps is to create a secret and, as of late, the maximum expiration date can only be up to 2 years in the future. (And no, using Custom is not going to help.) But what if we want it to last longer?

Azure dialog to add a client secret. Expiration date dropdown is expanded showing that the maximum expiration date can only be up to 2 years in the future

Yes, it’s a good idea to rotate your secrets. No, it’s not a good idea to do so until there are tools to automatically update all connections / connection references in your environment to use the new secret (managed identities / Azure Key Vault, anyone?). I was indirectly told that Isaac Stephens figured it all out and that it is indeed possible. Turns out, it’s not that complicated.

  1. Navigate to your app registration in Azure portal. (I trust you know how to do that as you presumably created that registration before, right?)
  2. Get your object id. (I knew it’d come in handy one day!)
    Screenshot of app registration properties in Azure portal. Object ID property is highlighted.
  3. Open Azure CLI prompt and sign in. I like using the in-place prompt.
    Screenshot of Azure portal page with an arrow pointing to the icon that launches Azure CLI prompt directly in browser.
  4. Enter this command to regenerate a secret that expires in 42 years.
    az ad app credential reset --id 42937242-dead-beef-dead-42d833e1cb42 --years 42
  5. Don’t forget to copy the generated secret from the output – it will be displayed only once!

For additional syntax see az ad app credential | Microsoft Docs.

Cover photo by Osama Elsayed on Unsplash

Tip #1402: Why you should not use Microsoft Lists/SharePoint Lists for business-critical Power Apps

SharePoint lists (and now Microsoft Lists) are an easy way to quickly create lists of data for use in a Power App. Makers can quickly create lists, add fields, and populate data for canvas apps.

However, not all apps are equal from a complexity perspective.

If the app is simple and contains data that is not sensitive, Microsoft Lists can be a good choice for data storage for canvas apps.

However, if your app contains many different lists, large amounts of data, complex relationships between tables, and sensitive data to which not all users of your app should have access, then you should consider using alternative data sources for your canvas app.

Here are the top 5 reasons why you should not use SharePoint lists for high-criticality Power Apps:

  1. It is not a relational database. Each list is independent and SharePoint lists were never intended to be used as a relational database. While SharePoint does have a field type called Lookup, it does not serve as a true relational database. Lookup fields in SharePoint lists are very difficult to work with in canvas apps, and performance with many lists in the same app will be slower than if you have a true relational database.
  2. Common formulas are not delegable to SharePoint: With canvas apps, heavy processing of data is delegated to the data connection. Common functions like If and Collect/ClearCollect are not delegable with SharePoint, while they are with other data sources like Dataverse. This means that if you need to return more than 500-2,000 records at a time, it is very difficult to do so with SharePoint lists. With other connections like Microsoft Dataverse, Collect is delegable, allowing for a much higher number of records.
  3. Security: if you share an app built on SharePoint with a user, you also need to share the underlying lists with them. This means if the user should not be able to see all of the records in the table (or update all of the records in the table), they could open up the underlying list and view or modify any of the records, even if you are obscuring the records inside of your app. Sure you can hide the list, but obscurity != security.
  4. SharePoint lists have limits in how they can be filtered in Power Apps: for example, filtering on yes/no fields does not work. Microsoft Dataverse-based apps support complex filtering via views, which can include multiple tables.
  5. If your app exceeds the capabilities of SharePoint Lists, you need to start over—many apps begin as personal productivity but over time become more critical to an organization. Before you build your app, think through the potential roadmap—if there is a potential that it will increase in data, security sensitivity, or number of users in the future, build it on something other than SharePoint lists.

What is the alternative?

If you want to start with the benefits and ease of use of Microsoft Lists without having to purchase premium licenses, start with Dataverse for Teams. Dataverse for Teams is included in the standard licensing for Office and Microsoft Teams.

In Microsoft Teams, click Apps and search for Power Apps, then click Open

Click New App

Select the name of the Team where you want the app to be used, then click Create

This will open the Power Apps Studio inside of Microsoft Teams.

Give your app a name

Click Create new table

Give your table a name

Define your columns in a Microsoft Lists-like experience.

Dataverse for Teams table creation has all of the things that are great about Microsoft Lists, without the major downsides.

  • It is included in the standard licensing
  • It’s easy to use to create and modify tables
  • It’s a real relational database—if you choose lookup for field type, it creates a real database relationship between the two tables
  • It supports more advanced delegation and filtering.
  • It supports real record ownership and security. You can specify different table permissions for users based on their role in the Team

  • It supports more advanced filtering via Dataverse views

  • If you exceed the capabilities of Dataverse for Teams you can upgrade your environment to a full Dataverse environment without having to rebuild your app.

Limitations of Dataverse for Teams

Dataverse for Teams environments are limited to 2 GB of data. For larger environments with more complex security requirements, I recommend Microsoft Dataverse. For a full comparison of Dataverse for Teams vs. Dataverse, see Dataverse for Teams vs. Dataverse – Power Apps | Microsoft Docs.

Thanks to Matt Devaney for contributing ideas to this tip.

Cover photo by Donald Giannatti on Unsplash.

Tip #1401: Round time in Power Automate

Today’s conversation:

- Hey, wanna see cool way to round time in Power Automate?
- Sure
- *shows the trick*
- Why it's not a tip yet?
- Mmmmm, dunno...

True story.

Sometimes we need to round time in Power Automate. For example, you want to schedule a meeting for tomorrow, at the same time as now. Except “now” could be 11:28 AM, and everyone loves meetings scheduled for random times that don’t align to 30-minute boundaries.

The trick is to use the simple fact that there is no such thing as a date/time data type in Power Automate – it’s all text in a prescribed format. This is the magic expression:


What does it do? It strips the minutes and seconds portion of the string by selecting just the date part and hours and then adds 00:00 as minutes:seconds (or you can add 30:00 if you want to be fancy).

‘Z’ is for ‘Ze UTC Time’, if you must know.
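The expression itself didn’t survive here, but from the description it is presumably the classic substring trick, something along the lines of concat(substring(utcNow(), 0, 13), ':00:00Z') – treat that as a reconstruction rather than gospel. The same string surgery, sketched in Python:

```python
# utcNow() in Power Automate returns an ISO 8601 string like this:
now = "2021-03-01T11:28:45.1234567Z"

# Keep the date and the hour (the first 13 characters), then append
# ":00:00Z" to zero out the minutes and seconds.
rounded = now[:13] + ":00:00Z"

print(rounded)   # 2021-03-01T11:00:00Z
```

Swap ":00:00Z" for ":30:00Z" if you want the fancy half-hour variant mentioned above.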

That’s it! Occam’s razor for the win.

Tip #1400: Excluding inactive records from duplicate detection

It’s good to be reminded every now and then about the features that have been around for a long time and are documented, but somehow any question from the business about them draws a blank and the ubiquitous “I have to get back to you”.

Today’s reminder is from Ankita Chavan. (Do you use a feature all the time? Email a reminder to jar@crmtipoftheday.com)

When we merge two records, one record becomes the master record (the active record) and the other record gets deactivated but is still present in the system.
If you then make changes to the active record, you will get the alert below due to the OOB duplicate detection rule set for the table (entity).

If you do not want this alert, follow these steps: unpublish the rule, tick “Exclude Inactive matching records”, and publish it again.

Official docs: Set up duplicate detection rules to keep your data clean – Power Platform | Microsoft Docs

Tip #1399: Dynamic ‘+’ email aliases in Office 365

We rarely publish tips outside of our cozy Power Platform bubble (more like a Zeppelin these days though!). But this one… THIS ONE… I’ve been dreaming about it since I got my first email eons ago… It’s here now!

Support for Dynamic ‘+’ Email Aliases in Office 365 – Customer Feedback for Microsoft Office 365 (uservoice.com)

  1. Make sure you have your Exchange Online PowerShell session going
  2. Type the magic words
   Set-OrganizationConfig -AllowPlusAddressInRecipients $true

From now on you can write george+one@crmtipoftheday.com on your party invitations.
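The convention behind plus addressing is that everything between the “+” and the “@” is ignored for delivery, so mail to the alias still lands in the base mailbox. In Python terms (a sketch of the convention, not of Exchange internals):

```python
def base_mailbox(address: str) -> str:
    """Strip a '+tag' suffix from the local part of an email address."""
    local, _, domain = address.partition("@")
    return local.split("+", 1)[0] + "@" + domain

# The party invitation alias resolves to the regular mailbox:
print(base_mailbox("george+one@crmtipoftheday.com"))  # george@crmtipoftheday.com
```

The tag itself survives in the To: header, which is what makes it useful for filtering and for figuring out who sold your address.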

Tip #1398: Set calculated field to null

Calculated columns are great for quick, simple formulas. They are fast, instant, and always up to date. But what if you need to clear the value of the column as part of the calculation?

A customer has sites, and sites have assets in various categories (hardware, software, etc.). The other day I had a requirement to show the total value for an asset category as $0.00 if the site had the assets but wasn’t live yet, and to show nothing at all if the site didn’t have assets in a specific category – serving as a visual clue. Great idea, and a calculated column with a condition on status should do the job. Except that sometimes I need to set it to null.

Create a dummy column Null Decimal. Not searchable, optional, no decimal places, never visible, always null.

Now we can use this column to set the value of a calculated decimal or a currency column.
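The branching the calculated column performs can be pictured like this (a Python model of the condition; the function and status names are made up for the sketch, only the Null Decimal trick comes from the tip):

```python
# Illustrative model of the calculated column's condition:
# - site is live             -> real total
# - has assets, not live yet -> 0.00
# - no assets in category    -> null, taken from the always-empty
#                               "Null Decimal" dummy column
NULL_DECIMAL = None   # the dummy, always-null helper column

def category_total(has_assets: bool, is_live: bool, total: float):
    if not has_assets:
        return NULL_DECIMAL
    return total if is_live else 0.0

print(category_total(True, True, 1234.5))   # 1234.5
print(category_total(True, False, 1234.5))  # 0.0
print(category_total(False, False, 0.0))    # None
```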

Cover image by Gerd Altmann from Pixabay

Tip #1397: Reduce extra Power Automate runs and overcome trigger limitations

You know I’m a sucker for some open source goodness. This time the goodness is from Aiden “President” Kaskela. (Got something to share? Send your wares to jar@crmtipoftheday.com).

The Common Data Service (Current Environment) trigger in Power Automate allows you to run your cloud flow when certain fields are updated, but there are two shortcomings:

  1. Your flow can trigger on multiple fields, and you don’t know which one changed.
  2. In Dataverse, fields may register as a change even if the value doesn’t change (typically caused by updates through code that include the current field value instead of just the changes).

Since you don’t know what fields changed, or if they’ve even changed at all, your flows may be running unnecessarily and doing a lot more work than they need to – and in a world with API limits, that’s something to be concerned about.

I built Power Automate Assistant to address these challenges; it not only tells you which fields changed but also cuts down on the number of flow runs caused by ghost updates. The tool is open source and available under the MIT license.

The Dataverse considers a field as having changed even if the value doesn’t actually change. We can’t control that. What we can do though, is create a new field and control when and how that is changed. By ensuring only true changes are tracked, we can bypass unnecessary flow triggers.

Here’s how it works:

In this example, we have a flow that runs when a Contact is created or updated, if the First Name or Email changes.

  1. Create a new text field on your entity (afk_ModifiedFields). In this case I’ll stick with 100 characters, but it should be long enough to hold all the schema names of the fields you’re tracking.
  2. Install the Power Automate Assistant and configure it on the contact so it runs any time the First Name or Email changes.
  3. When those fields are updated, the tool will evaluate the field values to ensure it actually changed. If the fields have changed, it’ll update the new text field (afk_ModifiedFields) with the field names that changed.
  4. Configure your flow to run on afk_ModifiedFields.

What happens now?

With this setup, if you update the contact’s email address and save the record, your new text field will say “emailaddress1”. If the first name changed, it’ll say “firstname”. If both changed, it’ll say “emailaddress1,firstname”.

By setting your flow to run on that text field, we can ensure it only triggers if we have a real change, and on top of that, you can treat that new text field as an array and see what fields triggered the update and take actions accordingly.
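The comparison the tool performs can be pictured like this (a Python sketch of the idea, not the actual plugin code; the record dictionaries stand in for the old and new record images):

```python
def changed_fields(old: dict, new: dict, tracked: list) -> str:
    """Return a comma-separated list of tracked fields whose value
    actually differs between the old and new record images."""
    return ",".join(f for f in tracked if old.get(f) != new.get(f))

old = {"firstname": "George", "emailaddress1": "george@crmtipoftheday.com"}
new = {"firstname": "George", "emailaddress1": "g.doe@crmtipoftheday.com"}

# Only the email really changed, so only it lands in afk_ModifiedFields,
# and a "ghost update" that changes nothing produces an empty string.
print(changed_fields(old, new, ["emailaddress1", "firstname"]))  # emailaddress1
```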


Where do I get it?

Managed and unmanaged solutions and source code are available at https://github.com/akaskela/PowerAutomateAssistant

Step-by-step instructions are in the wiki on GitHub: https://github.com/akaskela/PowerAutomateAssistant/wiki/TrueFieldChange

Cover photo by Photoholgic on Unsplash

Tip #1396: Multi-currency aggregates in Dataverse


When using aggregates with multiple currencies, make sure the business understands how the totals are calculated, how the exchange rate is used, and why it’s a good idea to roll your own calculations.

Bring it on

Multi-currency support has always been one of the staple features of …. eerrr … Dataverse since version … uhm … 4.0? And it’s a great one out of the box: enter the value, specify the currency, and kaboom, it’s converted on the fly using whatever the current exchange rate is, and saved in both base and record’s currencies.

Where it starts getting complicated is when you consider changes over time and aggregates. The time component affects the calculations because the exchange rate changes. Aggregates throw in another spanner because you can’t simply add Icelandic krónur and Netherlands Antillean guilders.

Let’s say we have sites around the globe with some equipment purchased and assigned. Our base currency is USD. Site totals are calculated using rollup columns.

Site record for Sydney, Australia (not Sydney, Nova Scotia) as of 1 July 2020:

Exchange rate: 0.6895

Name     | Price | FX     | Total USD
5m cable | $30   | 0.6895 | $21

How Dataverse calculates the totals:

  • Take the price values entered by the user ($50 and $30), convert to the base currency ($34 and $21 USD)
  • Calculate the total using base currency ($55)
  • Convert back to $80 in the record’s currency (AUD) using FX rate of 0.6895
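The three bullets above, run through Python with the numbers from the example (a simplified model of the rollup, rounding to whole dollars for display; the rate is USD per 1 AUD):

```python
# Dataverse-style rollup: convert each line to the base currency (USD),
# sum in base currency, then convert the total back to the record's
# currency (AUD) at the current exchange rate.
lines_aud = [50, 30]     # item prices entered in AUD
fx = 0.6895              # USD per AUD, as of 1 July 2020

usd_lines = [p * fx for p in lines_aud]   # 34.475 and 20.685 USD
total_usd = sum(usd_lines)                # 55.16 USD
total_aud = total_usd / fx                # back to AUD

print([round(v) for v in usd_lines])      # [34, 21]
print(round(total_usd))                   # 55
print(round(total_aud))                   # 80
```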

That was easy, right? Skip forward to November 2.

Exchange rate: 0.7

Name     | Total | FX     | Total USD
5m cable | $30   | 0.6895 | $21

Same data, but we just lost $1 because the exchange rate went up to 0.7. Individual items didn’t recalculate as we haven’t changed anything.

Let’s keep going. On the same date we added another item:

Exchange rate: 0.7

Name     | Total | FX     | Total USD
5m cable | $30   | 0.6895 | $21

Note that the exchange rate for the last item is different. Roll forward to today:

Exchange rate: 0.7628

Name     | Total | FX     | Total USD
5m cable | $30   | 0.6895 | $21

And we seemingly lost money again because our peso keeps climbing.

While the math is correct, it does not feel right. We purchased items in AUD, so the total shouldn’t change. Right? The answer is “it depends”: on how your business records the value of the assets, how the depreciation is calculated, and whether the moon is in the house of Saturn. We’re not even taking into account when the individual lines will recalculate – that’s a separate topic.

The most common localized style of recording is to calculate the aggregate value in the local currency and then convert to USD. Total line should look like this:


Things can get really heavy if individual items can indeed be in different currencies. You’d need to decide whether you want to convert the items to the base or the local currency when doing the calculations.

That makes more sense, but it’s not possible for Dataverse to perform these mental gymnastics. Solution? Do it yourself. It wouldn’t be too difficult to write a cloud flow that iterates over the assets and calculates the total value the way the business wants. Needs to be real-time? Summon a developer and get them to write a plugin.
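Whatever tool does the work, the localized-style aggregation itself is trivial; a minimal Python sketch, reusing the items from the earlier example and today’s rate (illustrative numbers, not accounting advice):

```python
# Localized style: sum the items in the local currency first, then
# convert the single total to the base currency (USD) once, at the
# current rate. The AUD total never drifts when the FX rate moves.
lines_aud = [50, 30]
fx_today = 0.7628                     # USD per AUD, today

total_aud = sum(lines_aud)            # 80 AUD, stable
total_usd = total_aud * fx_today      # converted once for reporting

print(total_aud)                      # 80
print(round(total_usd, 2))            # 61.02
```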

Bottom line

Multi-currency aggregates rarely work the way the business wants out-of-the-box, handle them with care.

  • Remember that any OOB aggregates are done in the base currency.
  • Understand that individual records recalculate only when they are saved, and that they will use the current exchange rate (stored in the currency settings). That may or may not be what the business wants or expects.
  • Explain the calculations to the business and confirm that is (or is NOT) expected and/or correct.
  • Decide if the exchange rate should be updated on a regular basis using some external FX service OR if the business would prefer to have a fixed exchange rate and adjust it as required on an infrequent but regular basis (not an uncommon accounting approach).
  • Apply UCS™ methodology[1] to figure out if custom calculations make sense. Most of the time they do.

[1] Use Common Sense