Tip #975: What is “user activity tracking” in Gamification

One of the new KPIs introduced in the May 2017 update for Dynamics 365 Gamification is “user activity tracking.”

According to the documentation, this KPI “awards points based on a user’s activity in Dynamics 365.” This circular definition is not terribly helpful, as it doesn’t define the types of user activity that earn users points in a game.

Ahmed Hudda from Microsoft provides some additional clarification:

User Activity Tracking KPI refers to looking at audit log activity around CRM sign-in activity for the user. It checks user sign-in based on the time interval you set up in the KPI (e.g. 12 hours or 24 hours) and sends that data over to Gamification for active games using that particular KPI. If you set up the KPI with, say, a 12-hour window, it will check whether a user has logged in at least once in the last 12 hours.

The goal here is to use Gamification to help companies with active Dynamics 365 licenses get the most out of their investment by encouraging steady user interaction with the product.

This means that if you count total logins for a user in the audit logs, your number may differ from the number awarded in Gamification, but the approach Microsoft has taken is a good, balanced one. It weeds out some of the noise in the audit logs without reflecting inflated totals when users log in multiple times per day or connect to CRM for Outlook without really using the system. And by counting all logins within an interval as one user activity, it prevents users from gaming the system by logging in and out multiple times.
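As a sketch of that bucketing idea (my illustration of the counting rule described above, not Microsoft's actual implementation), collapsing raw sign-in timestamps into one activity per KPI interval could look like this:

```javascript
// Illustrative sketch: many logins within the same KPI window count as
// one user activity. Function name and bucketing scheme are my own.
function countUserActivities(loginTimesMs, intervalHours) {
  const intervalMs = intervalHours * 60 * 60 * 1000;
  // Bucket each login by the interval window it falls into, then count
  // distinct buckets: repeated logins in one window collapse to one activity.
  const buckets = new Set(loginTimesMs.map(t => Math.floor(t / intervalMs)));
  return buckets.size;
}

// Three logins within the same 12-hour window count as a single activity.
const logins = [Date.UTC(2017, 4, 1, 8), Date.UTC(2017, 4, 1, 9), Date.UTC(2017, 4, 1, 10)];
console.log(countUserActivities(logins, 12)); // 1
```

So a user who signs in ten times before lunch scores the same as one who signs in once, which is exactly the anti-gaming behavior Ahmed describes.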

I’m glad to see that Microsoft has given us an easy-to-use way to see who is using the system.


Tip #974: Attachments vs. Documents

We have posted here about attachments, and we’ve posted about documents and whether you should extract your attachments to Azure or SharePoint. But these discussions can leave users confused, so in this post I’m hoping to add some clarity to the topic.

In my opinion, attachments and document management are two different use cases. While they can overlap, they address two distinct scenarios:


Attachments are file artifacts that will accumulate as you use Dynamics 365. Some will come from emails that are tracked; some may be photos that you take while on site at a client location; some may be news clippings related to a customer. They are unstructured data, are infrequently accessed, and are only needed by people working inside of Dynamics. They do not require revision control.


Documents are more important files and typically are related to a specific process, such as a sales process. There may be legal reasons you need to store them, and you need to be able to quickly find a specific document. They may be things like sales proposals, quote documents, or contracts, which may go through multiple revisions, and you need to be able to view the history of the document. Access to these documents is frequently required by people who are not Dynamics users.

Why is this an important distinction?

In the cloud world, discussions around documents and attachments are frequently driven by storage cost. And while that is important, it overlooks the more important factor of user experience. In other words, just because you extract your attachments to Azure in a supported manner doesn’t mean that you don’t still need the SharePoint document integration.

And I’ve had many clients that have SharePoint document integration tell me “we don’t use attachments in CRM,” only to find that they have 10-20 GB of attachments in their database from email attachments. Don’t expect users to put unstructured attachments in the documents area, and don’t try to force users to try to manage documents with attachments. Use the right tool for the job.

The right tool for managing sales proposal documents is SharePoint. The right tool for quickly uploading photos from your phone while on a site visit is attachments (and then extracting them to Azure BLOB).


Tip #973: Use Trace not Console

For a long time I’ve been a big fan of console applications. They’re a great way to put something together quickly, to see if an idea works, or simply to test a piece of code or yet another library. And how do you see the output when you need some? They teach you that in code school:

Console.WriteLine("Hello, world");

I was wrong and I apologize. In fact, I declare the use of Console for logging or test output to be a case of spießrutenlaufen (running the gauntlet) and banish it forever. Use Trace instead:

Trace.WriteLine("Hello, world");

There are multiple advantages: you can set and use trace levels, boolean switches are available, and there are handy WriteIf and WriteLineIf methods, to name but a few. Oh, and the word itself is shorter. The most important advantage, of course, is the concept of listeners, i.e. it’s totally up to you where to direct the output. It can be done in a config file

    <trace autoflush="false" indentsize="4">
        <listeners>
            <add name="fileListener"
                 type="System.Diagnostics.TextWriterTraceListener"
                 initializeData="TextWriterOutput.log" />
            <remove name="Default" />
        </listeners>
    </trace>

or straight in code

var logger = new ConsoleTraceListener();
Trace.Listeners.Add(logger);

The concept of listeners really shines when you need to interact with other libraries that also make use of tracing. Take, for example, XrmTooling. If you use CrmServiceClient in your application, but something is not working and troubleshooting is required, simply do this before initializing CrmServiceClient:

var logger = new ConsoleTraceListener();
TraceControlSettings.TraceLevel = SourceLevels.All;
TraceControlSettings.AddTraceListener(logger);
var client = new CrmServiceClient(...);

The output will light up with all sorts of goodness, including interesting performance bits like the timing of queries.

When using Console, output is not guaranteed either. If you try running your console code as an Azure Function on the Consumption plan (where the mighty Kudu console is not available), good luck figuring out where your Console.WriteLine output went. Even though tracing in Azure Functions is different and not based on System.Diagnostics classes, it’s easy to whip up your own tiny listener, as we did in Tip 808:

public class TraceWriterWrapper : TraceListener
{
    private readonly TraceWriter _log;

    public TraceWriterWrapper(TraceWriter logger)
    {
        _log = logger;
    }

    // Forward everything written via Trace to the Azure Functions log
    public override void Write(string message) => _log.Info(message);
    public override void WriteLine(string message) => _log.Info(message);
}

public static void Run(string input, TraceWriter log)
{
    Trace.Listeners.Add(new TraceWriterWrapper(log));
    // any Trace output from your code (and libraries) now lands in the function log
}

and all of your trace output will reappear in the function log.

Of course, there is always Console.Beep() if you need a sound.


Tip #972: Data import with date field fails

Recently we were struggling with the import of a spreadsheet into Dynamics 365. If the date field was included, the import would fail. The user doing the import was set to the US format, but the date column in the spreadsheet was formatted as dd/mm/yyyy.

We found that we could get past the issue by saving the import spreadsheet as a CSV file. After saving as CSV, the file could be successfully imported; in XLSX format it could not, even after reformatting the date column.

I’ve noticed that others in the Dynamics Forums have reported issues importing dates via the import utility from XLSX files, but that saving in CSV format works. You may also have success with Wayne Walton’s suggestion to format the dates in the spreadsheet in the ISO standard YYYY-MM-DD format.
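If you want to try Wayne's approach programmatically, a minimal sketch (my own helper, assuming the spreadsheet dates arrive as dd/mm/yyyy strings) could normalize them before import:

```javascript
// Convert a dd/mm/yyyy string to the unambiguous ISO 8601 year-first form,
// so the import utility cannot misread day and month.
function toIsoDate(ddmmyyyy) {
  const [day, month, year] = ddmmyyyy.split('/');
  return `${year}-${month.padStart(2, '0')}-${day.padStart(2, '0')}`;
}

console.log(toIsoDate('31/05/2017')); // "2017-05-31"
console.log(toIsoDate('1/6/2017'));   // "2017-06-01"
```

Running a pass like this over the date column before saving the file sidesteps the locale guessing entirely.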




Tip #971: Be careful when using unsupported solutions

When the wheels are falling off

It does not happen often, but it does happen. After some retrospection, I urge everyone to be very careful when following Joel’s advice on using the attachment extractor solution. (Some say one needs to be careful when following any of Joel’s advice, but that’d be a discussion for another day.)

My arguments (both of them):

  1. It is not supported.
  2. It does not come with the source code meaning you cannot support it yourself either.

Unless either of these changes, i.e. the Microsoft Labs folks have a change of heart and release it as open source, or offer some kind of support, you’re risking being seriously stuck if something goes wrong. (The only reasonable option is to call Joel; his direct number is 555-365-HELL.) The solution does not move existing files, and we cannot enhance it without going through some shady reverse engineering of the code.

What are the options?

  1. Third-party solutions. Paid and supported.
  2. Open source. One of the upsides is availability for on-premises deployments. Not that on-premises storage is expensive or scarce, but it’d be an interesting option to minimise SQL consumption prior to a migration to online.

Tip #970: When attribute and entity collide

This tip is from Martin Tölk.

If the name of an attribute on a custom entity matches the name of the entity itself, you will not find that attribute in the metadata returned by the WebAPI. Dynamics 365 will play games with your mind by appending ‘1’ to the attribute name, presumably to avoid clashes.

For example, if you define an entity:

[Screenshot: entity definition with the name new_locationcode]

and include an attribute with the same name (we use primary attribute, the same as Martin did, but the problem is reproducible on any attribute):

[Screenshot: attribute definition with the name new_locationcode]

and then attempt to get the data using the WebAPI: https://org.crm.dynamics.com/api/data/v8.2/new_locationcodes?$select=new_locationcode, you will get the error “Could not find a property named ‘new_locationcode’”.

Try getting all attributes by removing $select (and adding $top to keep it short): https://org.crm.dynamics.com/api/data/v8.2/new_locationcodes?$top=1, and you’ll find that the attribute was magically renamed to new_locationcode1.


Ugh. The easiest way to avoid this headache is to avoid naming clashes in the first place.
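If you must code against such an entity before the clash can be fixed, the observed renaming can be captured in a tiny helper (my sketch of the behavior seen above, not a documented server algorithm):

```javascript
// Hypothetical helper reflecting the behavior observed above: when an
// attribute's schema name equals its entity's name, the Web API exposes
// the property with a '1' suffix.
function webApiPropertyName(entityName, attributeName) {
  return attributeName === entityName ? attributeName + '1' : attributeName;
}

console.log(webApiPropertyName('new_locationcode', 'new_locationcode')); // "new_locationcode1"
console.log(webApiPropertyName('new_locationcode', 'new_other'));        // "new_other"
```

Routing your $select clauses through a mapping like this at least keeps the workaround in one place.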

Be smart, be like Martin and send us your tips via Facebook Messenger or simply email jar@crmtipoftheday.com.


Tip #969: The problem with sharing

Does Microsoft have guidance about how much sharing is too much? — CRMTOD reader

I find it hard to believe we have gone 968 tips without talking about the risks of excessive sharing in Dynamics 365. Few features in CRM parallel sharing on the “this is the best thing ever/this is my worst nightmare” scale.

For background, see Microsoft’s documentation on how record-based security can be used to control access to records in Microsoft Dynamics 365.

The good

Sharing is a good feature because it gives administrators and users with the appropriate permission the ability to grant specific permissions on specific records, and it is useful for handling exceptions to the rule. Need salesperson 2 to handle salesperson 1’s accounts while she is out for a month filming Survivor season 83? Sharing can do that. Sharing can also be automated: if you need a specific condition to automatically share records with a user or team, simple plugins, workflow assemblies, or Scribe can do that. This has been the answer to many Dynamics clients’ funky security requirements.

The bad

While a very useful feature, sharing has a dark side.

  • Performance: sharing is facilitated by the Principal Object Access (POA) table. When you share a record with a user or team, a record is created in the POA table containing the ID of the user, the ID of the record, and the permission that he or she should have. But that’s not all! The cascading nature of sharing means that if a sharing-enabled parental or configurable cascading relationship exists, the child records in those relationships will also be shared with the user or team (and additional records will be added to the POA table). There are also a bunch of murky reparent/inherited-share scenarios that can create records, which can cause the POA table to grow quickly. This becomes a performance issue when the table gets extremely large (somewhere between 20 million and 2 billion records). When you query CRM, such as opening a view, running an advanced find, or viewing a chart, the results are filtered by the POA table. If the table is quite large or indexes are not optimized, this can lead to very slow system performance.
  • Administrator’s nightmare: Quick: show me which records are shared with Bob. You can’t do it. While you can click on a record and show who it is directly shared with, there is no easy way to do that for all records. Also, cascading/inherited shares don’t show in the sharing dialog on the record. Without opening each record in the context of that user, it is virtually impossible to know if your sharing strategy is working correctly.
  • Forgotten shares: Remember the sales rep you shared records with while her buddy was off for a month? Odds are you will forget to unshare those records. Got a workflow that automatically shares records with Tim if they are in Saskatchewan and the plumbing industry? Well, Tim moved to a different industry vertical last month. Did your workflow remember to unshare all of those records?
  • Can’t “make it right”: After you use the system for a year or two, you may find that things get a bit off or you decide to make a wholesale change to your sharing strategy, so you want to run a “make it right” batch job to set/update all records with the appropriate sharing permission. There is no easy way to do this with sharing.

The answer

So to answer the question: yes, Microsoft does offer some guidance (if you are on-premises) on optimizing and controlling the growth of your POA table. But probably the best guide to understanding the POA table is this classic post from Scott Sewell. In it he explains how to decode the structure of the POA table and understand how your sharing strategy will impact database size. He also offers an Excel-based decoder tool to encode/decode the POA table. Unfortunately the link in that post to the secret decoder ring is no longer valid, but Scott has located the file and you can download it here.

So now that you understand how the POA table works, what steps can you take to avoid the dark side of sharing?

  • Team ownership: In the old days, teams couldn’t own records, so we had to share to grant multiple people access to a record, without granting access to the entire business unit. With team ownership, you can assign records to teams of users in multiple business units.
  • Share with teams, not users. If you share a record with 10 users, 10 POA records are created, plus 10 more POA records for each cascading shared child record. If you share the record with a team of 10 users, only one POA record is created (along with one POA record for each cascading share). This will dramatically reduce the size of your POA table. Want to take away a user’s permissions? Remove them from the team.
  • Use access teams for controlled sharing. Say you can’t use owner teams, but you still want to grant ad-hoc access to specific records for specific users: some should only read the record, some need read/write. Access teams can handle that, and you can have multiple access team templates, one for read and one for read/write. Access teams are designed with performance in mind, so the team is not actually created and the record not shared until you add the first member.
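To make the arithmetic in the bullets above concrete, here is a back-of-envelope POA row estimate (my illustration of the counting rules described above, not an official sizing formula):

```javascript
// Rough POA row count: one row per principal per record, where each shared
// record drags along its cascading child records as extra rows.
function poaRows(records, childrenPerRecord, principals) {
  return records * (1 + childrenPerRecord) * principals;
}

// Sharing one record (with 10 cascading children) with 10 individual users:
console.log(poaRows(1, 10, 10)); // 110 rows
// The same share done through a single team:
console.log(poaRows(1, 10, 1));  // 11 rows
```

Multiply that by thousands of records and the team approach is the difference between a lean POA table and a bloated one.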

The real beauty of the team approach, be it owner or access teams, is that it makes it much easier to see what records a user has access to, just by seeing what teams he or she is a member of. If you want to run a “make it right” batch job to reset all sharing permissions, you can do that by wiping out your team memberships and then running an SSIS or Scribe job to repopulate the teams based on the new logic.

So please share your toys and Dynamics record access, but do it wisely.


Tip #968: Should I modify system reports?

There are 26 system SSRS reports (not counting subreports) that come with Dynamics 365. If you find one that is close to what you need (like the quote report), should you modify the system report?

I say no. Here is why:

  • System reports currently in Dynamics 365 are (for the most part) the same system reports that were in Dynamics CRM 3.0. They were written by someone who is probably no longer at Microsoft. When you dig into the RDL, you will find that even the most basic system reports include a lot of overhead, such as system reporting parameters that are not documented in the SDK.
  • Even though the system reports have not changed much over the years, upgrades will sometimes overwrite/republish system reports. So if you modify the system reports, there is a possibility that a future upgrade could overwrite your changes.
  • If you are online, you will find that some system reports still use T-SQL queries, while that is not allowed for us mortals. Microsoft sometimes gives itself special dispensation to do things that we can’t. So if you want to copy a report and modify it, you will likely need to convert your copy to FetchXML.
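For instance, a T-SQL filter like `SELECT TOP 50 name FROM account WHERE statecode = 0` would have to be expressed as FetchXML along these lines (a sketch using the stock account entity):

```xml
<fetch top="50">
  <entity name="account">
    <attribute name="name" />
    <filter type="and">
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
```

Joins, aggregates, and anything clever in the original SQL all need the same kind of translation, which is part of why recreating the report can be faster than converting it.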

For these reasons, other than minor/cosmetic changes to reports, my recommendation is do not modify the system reports. A good SSRS report writer can recreate the system report faster than she or he will be able to decode what somebody who is no longer at Microsoft was doing 10 years ago.


Tip #967: UI testing for Dynamics 365

Software testing is important, and Dynamics 365 is no exception. Fundamentally, developing for Dynamics 365 is different from developing, say, an ASP.NET MVC application, but users don’t really care, do they? It’s software that hopefully delivers business value; deal with it.

Developers who recognize the importance of the process have long been making inroads into unit testing the code they create, be it plugins that deliver magic on the server or JavaScript for client-side functionality. There are plenty of frameworks to choose from. For C#, some developers prefer Moq, some like Microsoft Fakes, and some have even created CRM-specific frameworks. For JavaScript zealots, there is QUnit for browser-based testing, something much more elaborate to step outside the browser, or, more recently, Jasmine.

That’s all great but, technically, users still don’t care, because with all these frameworks we’re not testing the user experience.

Great news, though: we now have a preview release of the UI Automation Library for Dynamics 365 CE and CRM. It comes from the Dynamics 365 team at Microsoft and is based on Selenium. I’m just going to copy their description verbatim, because it sums up the purpose and the functionality really well:

The purpose of this library is to provide Dynamics customers the ability to facilitate automated UI testing for their projects. These API’s provide an easy to use set of commands that make setting up UI testing quick and easy. The functionality provided covers the core CRM commands that end users would perform on a typical workday and working to extend that coverage to more functionality.

This is great news and, in fact, it looks like the last missing piece of the fully automated testing pipeline.


Tip #966: E-mail integration in team or department deployment

Joel has been producing tips by the truckload; I don’t think he’ll notice if I sneak this one in, especially when a fellow comrade developer David “Xrm.Tools” Yack is in pain.


Anyone have any suggestions for where, let’s say, a team/department gets CRM in their own subscription, but their e-mail is still managed by their corporate email server, totally outside of the CRM subscription and their control?

They would like to have some of the goodness of e-mail integration with Dynamics 365, but have no ability to influence the corporate email strategy.

Any creative suggestions?


I have done that exact scenario with a forward mailbox. If the forward mailbox is on the domain and the email address of the forwarded email matches the address on a queue or user, the emails can be created in CRM.

Tîpp Jäår

Joel wrote about forward mailboxes many moons ago; it’s worth another read.
