Tip #1272: A better way to notify

I love my D365 email notifications

said nobody

Recently I spoke at a conference with Dynamics users, and I asked a group of about 50 people whether anyone liked their CRM email notifications. Nobody raised their hand.

There are several problems with traditional workflow-based email notifications:

  1. Everybody gets too many emails.
  2. Blanket workflow-based emails include notifications that I don’t care about.
  3. System-generated email messages that I can’t configure diminish my sense of control.
  4. Users who are overwhelmed with too many notifications will ignore or delete them.

Alternative options

Fortunately, there are better ways to notify in Dynamics 365. One option is not to notify at all: teach your users how to use views and dashboards to see the records they need to be aware of. You can also use activity feeds and set up automatic posts via workflow.

Another approach (currently in preview) is to use the relationship assistant and create new relationship assistant cards with Flow.

These are all great options, but in this post I’m going to show you an approach that I’ve used that lets the user decide what he or she will be notified about.

Notification Subscription Center

In this approach we give users choice in what they are notified about and the way in which they are notified. Want to try this solution out? Download a copy at the TDG Power Platform Bank.

Create the subscription center CDS entity

Create a custom entity called Subscription center. You will need the following components:

  • Standard owner field (who should be notified)
  • Notification type — the way the user will be notified. I chose an option set because, while initially I just have email and SMS options, in the future I may add additional notification types, such as mobile push.
  • N:1 relationship with whatever entities you wish to enable notification subscriptions for. You do not have to display the lookup fields for these entities on the form if users will be creating notification subscriptions from the parent record.

Now users can “subscribe” to any record for notification and specify what type of notification they wish to receive.

Creating the Flow

Now when you want to create a notification, such as notifying when an opportunity is won, you create a flow.

  1. Create a flow that runs on change of the opportunity status reason field.
  2. Check to see if the opportunity is won.
  3. List subscription center records related to the opportunity where subscription type = email.
  4. List subscription center records related to the opportunity where subscription type = text.
  5. Insert a condition step and, using the length expression, evaluate whether the results of the email list records step contain at least one record (see the expression sketch after this list). If they do, do an apply to each over the value of the email list records step and send an email to each subscriber. You will need to get the user record of the owner to get the email address.
  6. Do the same thing for the SMS subscribers, sending them a text message via Twilio.
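In the condition step, you can use an expression like the following on the left side and compare it with “is greater than” 0. This is a minimal sketch that assumes the email list records step is named “List email subscriptions” (rename it to match your own step):

length(body('List_email_subscriptions')?['value'])

If the result is greater than zero, at least one email subscriber exists and the apply to each loop can run; repeat the same pattern for the SMS list records step.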

Now users can choose what they get notified about and how they are notified.

Bonus tip

Check out this Flow template to see how to send a weekly digest email of notifications.

Cover image by Daderot [Public domain], via Wikimedia Commons

Tip #1271: Where is my tenant ID?

Every now and then someone from Microsoft may ask you for your tenant ID. In my case it was in relation to one of the preview programs in the experience.dynamics.com insider program.

But what is the tenant ID? First, what it is not:

  • It’s not your D365 environment URL
  • It’s not your D365 environment ID found in Settings/Customizations/Developer Resources

It’s your O365/Azure tenant ID. This can be confusing because some people (even some Microsofties) have been known to use the terms environment/organization/tenant interchangeably. Remember the excitement around the introduction of “multi-tenant” CRM a few years back? That should have been multi-environment. Learn the lingo:

Environment: a separate instance of D365/CDS. Think dev environment, prod environment, any number of CDS environments you may use with Flow and PowerApps.

Tenant: Your Azure instance–this is what is tied to your domain, and you can only have one tenant tied to a domain. All of your AD users and all of your Azure, Office, and Dynamics subscriptions are part of this tenant. See https://docs.microsoft.com/en-us/office365/enterprise/subscriptions-licenses-accounts-and-tenants-for-microsoft-cloud-offerings.

So how do I find my tenant ID?

There are multiple places this can be found, but this is the most direct IMO:

  1. Go to portal.azure.com
  2. Click on Azure Active Directory
  3. Click Properties under the Manage section
  4. Your tenant ID is the Directory ID. Azure gives you a button to copy the value.
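If you prefer the command line, here is a quick PowerShell alternative (a sketch that assumes the AzureAD module is installed):

# Install-Module AzureAD   (one time, if you don't have the module yet)
Connect-AzureAD
(Get-AzureADTenantDetail).ObjectId   # this GUID is your tenant ID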

Cover photo by Alex Block on Unsplash

Tip #1270: Table or view is not full-text indexed

Today’s tip is from Marius Agur Hagelund “Viking” Lind (actually, I’m confused, perhaps it’s Marius “Viking” Agur Hagelund Lind?). Got a tip of your own? Send it to jar@crmtipoftheday.com.

Cannot use CONTAINS or FREETEXT predicate on table or indexed view because it is not full-text indexed

Mean SQL Server

If you’ve ever gotten this error message, it’s probably because you tried searching via the SDK using the Contains operator on a field that isn’t full-text indexed.

For me it was searching for systemusers:

// nameSearch is a QueryExpression; ConditionOperator.Contains translates to SQL CONTAINS,
// which requires a full-text index on the underlying column
nameSearch.Criteria.AddCondition("firstname",
   ConditionOperator.Contains, 
   searchString);

I got the following nice error message in return, which puzzled me at first:

Cannot use a CONTAINS or FREETEXT predicate on table
or indexed view 'SystemUserBase' because it is not full-text indexed.

I tried checking in the system and found that I could search for names there when I used wildcard characters, and then it dawned on me that all these years of autocomplete, IntelliSense, and helpers have made me lazy and dumb. Using wildcards is not the same as using CONTAINS, which is obvious to anyone who has taken a SQL Server 101 course, so the solution was as easy as this:

// Like with leading and trailing wildcards gives the "contains" behavior
// without needing a full-text index
nameSearch.Criteria.AddCondition("firstname",
   ConditionOperator.Like,
   $"%{searchString}%");

But what about the Web API? Well, it turns out they removed the Microsoft.Dynamics.Crm.Contains action, and you simply use the contains keyword instead (api/data/v9.1/systemusers?$filter=contains(firstname,'mike d')). That is, unless you want to perform a full-text search in knowledge base articles:

https://docs.microsoft.com/en-us/dynamics365/customer-engagement/web-api/fulltextsearchknowledgearticle?view=dynamics-ce-odata-9
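And for the regular contains filter mentioned above, a full Web API request would look roughly like this (the organization URL is a placeholder):

GET https://yourorg.crm.dynamics.com/api/data/v9.1/systemusers?$select=fullname&$filter=contains(firstname,'mike')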

So lesson learned: Stop being a dinosaur and start using the web api.

Viking out.

Cover photo by Daiga Ellaby on Unsplash

Tip #1269: Oh crap, I lost my app

So you decide to “refresh” your development environment with a copy of your production environment, but too late you discover that copying an environment over another one makes canvas apps and flows created in that environment go away.

What should you do?

Other than kicking yourself for not remembering to back up your app and flow definitions, don’t do anything.

When you refresh a D365/CDS environment with a copy of another environment, the flows and canvas apps that were in the environment go away, but within a short period of time you will see a restored copy of the app and flows appear in your environment. The canvas apps will be appended with the word “restored” and the date. Flows that are restored will be turned off.

You will need to share the app again with whichever users should have access to it.

Tip #1268: Restrict CDS instance creation

With P2 licensing, can you control who can spin up a CDS instance via Azure Active Directory since each license comes with 2 CDS instances? (we don’t want hundreds of CDS instances cluttering up our tenant)

The question from an enterprise-size UG member, generously relayed to us by Jerry “Forever Tipster” Weinstock

Via set-TenantSettings example towards end of the post

David F Yack

tl;dr

How to govern environment creation

Download and install the admin PowerShell cmdlets as described here. Read more about our cmdlets here.

# Block environment creation by non-admin users (tenant admins can still create environments)
$settings = @{ DisableEnvironmentCreationByNonAdminUsers = $true }
Set-TenantSettings $settings

Note for PowerApps/Flow customers – If you use the new flag to restrict environment creation, only tenant admins will have the ability to create new environments.

Personally, I like the suggestion that the default should be opt-in, not opt-out. Or, as one of the commenters put it succinctly:

How quickly can I change that setting to false

Cover photo by Alice Achterhof on Unsplash

Tip #1267: Do PowerApps work with Dynamics 365 on premise?

The answer is “it depends.”

It depends on how you define “works with.”

There is no PowerApps connector for D365 on premise. There is a SQL Server connector, and this connector can connect to on-premises SQL databases, including the D365 on-premise SQL database. So you can make an app that reads directly from your on-premise CRM database, but this connector cannot update or create records in a supported manner. And as you long-time Tip of the Day readers know, we don’t recommend doing unsupported stuff (unless George says it’s ok). Another issue with this approach is that it is likely to be slow–it’s very difficult to get good performance reading from on-premises systems from the cloud, and you have to open up external access to your SQL Server. Just don’t do this.

The second option is to integrate your on-premise environment with the CDS–in effect setting up a hybrid environment where you have a copy of your configuration in the cloud as well as in your on-premise CRM, and create a bi-directional integration to synchronize data changes between the two environments. This is probably the best option, as it would recognize your existing security and record ownership and provide full CRUD capabilities. But this option also carries some potential overhead–you will need to reflect configuration changes in both (at least for the entities that are included in your PowerApps), and there will be a potential delay for record changes to synchronize between the two. If salesperson 1 updates a contact phone number in D365 on premise and salesperson 2 saves the same contact in her PowerApp, this could lead to some interesting data conflicts. There will also be licensing implications — your users will need to have at least a P1 license in the cloud to be able to use PowerApps that use the Common Data Service.

A third option is to synchronize your on-premise CRM data with another type of cloud storage — it could be an Azure data warehouse, a SQL database, a data lake, or any number of other places. These are viable connections for your PowerApp. The problem (like option 1) is with creates, edits, and deletes, as well as reads if security restrictions in D365 on premise need to be honored. If I want a Power BI dashboard or PowerApp that simply displays records from my CRM on-premise system, I can replicate the on-premise data to Azure and go to town. However, if users create or modify records from the PowerApp, or need to be restricted to viewing records following the same logic that CRM on-premise security roles dictate, this is not a great option, and you may inadvertently open the proverbial barn door.

My recommendation

Don’t do this. Go all the way to the cloud, not part way. Your Power Platform experience will always be more satisfying if you are using Common Data Service and Dynamics 365.

But wait, you say, we have a bunch of reports and system integrations that we can’t quickly upgrade, yet we want to give our users the value of the Power Platform now. I get that and totally understand that if you have been developing on CRM on premise for years, it takes a while to get some of those components cloud ready.

Maybe you should take option 2 and flip it on its head–instead of assuming that all of the users will keep using D365 on premise, what if all the users moved to the cloud, yet you kept D365 on premise around short term for reporting purposes? That way the users could start benefiting from the cloud and directly connected PowerApps and Flows now, while you keep on premise around as a read-only replicated reporting environment as you work to move those reports and integrations to a cloud-ready approach (like Power BI). This would minimize the overhead of the approach, as the majority of the users would simply access D365 online.

Summary

Avoid this if at all possible — go all the way to the cloud. If, for some reason, you need to do this short term, I recommend that you replicate your on premise data with the Common Data Service to enforce appropriate security and make your future cloud migration easier.

Cover photo by Elias Schupmann on Unsplash

Tip #1266: Sign out of Skype when forced to Teams

Anyone have any idea how to log out of Skype when your only option is to “Go to Teams”?

Daryl “Always Raising” LaBar

Why would you need to do that? This is my story. I had two Skype accounts: one from my company and another one from, uhm, a partner. I was signed into Skype with my own account and, after we got upgraded to Teams, I would face the “Great news – go to Teams” message every time I started Skype. Didn’t give it a second thought. Until, one day, I received an invitation from the partner to join a Skype call and I had to do that using my partner login. So I needed to sign out of Skype, which was not possible because it would kick me straight over to Teams. Which is exactly what Daryl was facing.

Oh man, I wish I could remember exactly the magic combo I used to accomplish that (Skype stuck on the team account and I needed the other one). I used the command line switches plus temporarily blocking sign in url (like disconnect your network adapter).

The Enabler

THAT’S IT!
Go to airplane mode, and when it says trying to sign in, you can cancel

Daryl

Moral of the story: when everything is going down the drain, pretend you are offline.

Cover photo by Peter Pryharski on Unsplash

Tip #1265: Dynamics 365 for Marketing prevents solution imports

Some Dynamics 365 administrators have found that installing Dynamics 365 for Marketing in their dev environment prevents them from moving their configuration changes to production.

Trigger warning–licensing talk ahead (but trust me, it will be worth it)

To understand why this is, you first have to understand how the marketing solution is licensed. Unlike almost every other Microsoft solution, D365 for marketing is licensed by the environment, not by user.

D365 for marketing is priced based on the number of contacts included in marketing activities over a 12 month period in an environment. This makes the pricing scale based on the amount of marketing that you do (and is very similar to other marketing automation providers), and each environment needs to be licensed for Marketing.

The problem comes when trying to move solutions if the Marketing app is not provisioned in the target environment. Dynamics 365 for Marketing adds dependencies to the appointment and user entities, so you will not be able to move these entities in an unmanaged solution from an environment with D365 for Marketing to an environment without D365 for Marketing.

See the official documentation for how Marketing apps are added.

But wait–we have marketing included in our license

You may have received a free marketing app with your Dynamics 365 Customer Engagement license–congratulations. Just understand that this doesn’t mean you have a license for all instances. If you choose to provision that free instance in your dev environment, you will also need to license Marketing separately for production.

See the FAQ about the “free instance.”

Recommendation

If you received the “free” marketing app but you aren’t 100% sure whether you are going to use D365 for Marketing in production, don’t provision the free app against your primary development or UAT environments–you can now have as many CDS/D365 environments as you want, so if you just want to stick your toe in the marketing pool before you are fully committed to it, spin up a new environment and install your free app there. That will avoid creating unwanted dependencies.

By setting up a separate dev sandbox for marketing and not installing marketing in your core dev environments, you will avoid adding dependencies for Marketing to your core configuration.

Finally, if you find yourself with configuration dependencies because Marketing is installed in dev and not in downstream environments, you will need to manually remove the lookup fields and navigation links from the entities with dependencies before you can move your solutions. Alternatively, moving your configuration changes in a managed solution may work (but then you need to be prepared for the ramifications of managed solutions).

Cover photo by marc liu on Unsplash

Tip #1264: Subscribe to AD changes using Flow

My flowbies! I want to trigger a Flow based on someone being added to an Azure AD Group. This doesn’t appear to be possible currently, as the Azure AD connector has no triggers. Am I correct?

Andrew Bibby

Hold my beer

The Microsoft Graph API includes a subscription feature: you create a subscription, and a listener application receives notifications when changes occur in the specified resource.

The process involves the following steps:

  1. Create the notification endpoint Flow
  2. Create an app in Azure AD
  3. Create a subscription

Let’s dig in.

Notification endpoint Flow

Creating a subscription requires a notification endpoint that must satisfy certain validation requirements, namely returning the validationToken passed as a query parameter.

  1. Use HTTP request as a trigger. When the flow is saved, the trigger will contain a unique URL that we will need later.
  2. The validationToken expression is triggerOutputs()?['queries']?['validationToken']. queries gives us access to the query string, and from there straight to validationToken.
  3. When the token is passed as a query parameter, we are in the validation stage; actual notifications won’t have the parameter. So here we split our execution (a sketch of this condition follows the list).
  4. We are asked to validate. As per the requirements, return the token value in a plain text body. Flow takes care of all the required decoding.
  5. We are receiving a notification. For now we simply return a 202 (Accepted) response right away. (If Microsoft Graph does not receive a 2xx code, it will retry the notification.)
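One way to express the branch condition from step 3, checking whether the validation token is present (a sketch; your exact condition may differ):

not(equals(triggerOutputs()?['queries']?['validationToken'], null))

If this evaluates to true, we are in the validation handshake; otherwise it is a real notification.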

App in Azure AD

Creating an app in Azure AD is very straightforward – just follow the documentation. Since we are subscribing to group changes, we need to add the Group.Read.All permission for the Graph API.

Subscriber Flow

I wish I could claim the technique of creating a subscription using Flow but the formidable John “Flow Ninja” Liu described the technique over a year ago :O. Just follow the steps and you’ll be all set. For a change, I decided to use Postman.

The easiest way to deal with authentication is to create a collection and set all requests within the collection to inherit the authentication token.

You’ll find all of the parameters in the app properties in Azure AD. And yes, callback URL does not really matter here but it’s required.
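If you want to see what Postman sends under the hood, the token request is a standard OAuth 2.0 client credentials grant against the Microsoft identity platform. A rough sketch (tenant ID, client ID, and secret come from your app registration and are sent as form parameters):

POST https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token
grant_type=client_credentials
client_id={application-id}
client_secret={client-secret}
scope=https://graph.microsoft.com/.default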

Once Postman has a token, sending a request to subscribe to group changes is a breeze:

You need to use the Flow URL from step 1 as the notificationUrl, and set expirationDateTime to something in the future but not too far out (less than 3 days). Note that times are in UTC.
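The body itself is small. A minimal sketch, POSTed to https://graph.microsoft.com/v1.0/subscriptions (the notificationUrl is a placeholder for your Flow trigger URL, and clientState is whatever secret you want echoed back in notifications):

{
  "changeType": "updated",
  "notificationUrl": "<HTTP trigger URL from the notification endpoint Flow>",
  "resource": "groups",
  "expirationDateTime": "2019-05-09T00:00:00Z",
  "clientState": "MaSekreet"
}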

Testing

After adding a user to one of the groups in Azure AD, you’ll see two (hopefully) successful runs for the notification Flow. The first one is a validation run (you can drill into it and check the validation token that was passed in). The second one is the real McCoy, containing the following data in the HTTP request body:

[
  {
    "changeType": "updated",
    "clientState": "MaSekreet",
    "resource": "Groups/deadbeef-dead-beef-dead-beef00000075",
    "resourceData": {
      "@odata.type": "#Microsoft.Graph.Group",
      "@odata.id": "Groups/deadbeef-dead-beef-dead-beef00000075",
      "id": "deadbeef-dead-beef-dead-beef00000076",
      "organizationId": "deadbeef-dead-beef-dead-beef00000077",
      "eventTime": "2019-05-07T11:08:15.4245258Z",
      "sequenceNumber": 636928240954245200,
      "members@delta": [
        {
          "id": "deadbeef-dead-beef-dead-beef00000088"
        }
      ]
    },
    "subscriptionExpirationDateTime": "2019-05-07T15:37:48+00:00",
    "subscriptionId": "deadbeef-dead-beef-dead-beef00000069",
    "tenantId": "deadbeef-dead-beef-dead-beef00000096"
  },
  {
    "changeType": "updated",
    "clientState": "MaSekreet",
    "resource": "Groups/deadbeef-dead-beef-dead-beef00000075",
    "resourceData": {
      "@odata.type": "#Microsoft.Graph.Group",
      "@odata.id": "Groups/deadbeef-dead-beef-dead-beef00000075",
      "id": "deadbeef-dead-beef-dead-beef00000076",
      "organizationId": "deadbeef-dead-beef-dead-beef00000077",
      "eventTime": "2019-05-07T11:08:15.4245258Z",
      "sequenceNumber": 636928240954245200
    },
    "subscriptionExpirationDateTime": "2019-05-07T15:37:48+00:00",
    "subscriptionId": "deadbeef-dead-beef-dead-beef00000069",
    "tenantId": "deadbeef-dead-beef-dead-beef00000096"
  }
]

That’s a lot to digest, but members@delta is the magic property that tells us a user has been added to the group. I’ll save digesting this JSON for another day.

Here you go, Andy!

Cover photo by Lance Anderson on Unsplash

Tip #1263: Microsoft Flow and Azure outages

Most of you were probably impacted by the recent Azure outage on May 2 (see https://azure.microsoft.com/en-au/status/history/).

Between 19:29 and 22:35 UTC on 02 May 2019, customers may have experienced connectivity issues with Microsoft cloud services including Azure, Microsoft 365, Dynamics 365 and Azure DevOps. Most services were recovered by 21:40 UTC with the remaining recovered by 22:35 UTC.

Azure status page writer

One of the questions that came up in the aftermath was: what happens to automated and recurring flows that have either triggers or actions impacted by an outage? (Thank you, Jerry, for asking.)

I can’t think of a better person to answer this than Stephen Siciliano, a Principal PM Director for Microsoft Flow.

For flows, we can divide the impact into triggers and actions. For triggers:

  • For automated flows that poll, these flows would have failed to start new runs during the interruption. However, the way polling triggers work is that they check for new data every so often; this means they automatically “heal” when the system is healthy — they simply process all of the events in the window since they last successfully ran, albeit significantly delayed. For webhook-triggered automated flows, those events would have to be resent.
  • For instant flow triggers (e.g. flows manually started by users on demand), users would have immediately received an error upon trying to trigger the flow. Each user will need to retry running the flow. Since the triggering itself failed, there is no way for an admin to ‘resubmit’ this failure.
  • For scheduled flow triggers, there may have been intervals that were skipped. These flows automatically resume upon the system healing.

For actions, it would be possible for a flow to have failed mid-execution if it had previously been triggered but the actions began failing. Flow makers may want to Resubmit failed flow runs. A maker can see the failures across their flows by going to the Alerts icon at the top of the Flow portal and selecting the flow runs that failed (they don’t need to inspect each flow individually).

And for all nay-sayers out there I can’t think of a better way to express my attitude towards what happened in Azure: