Tip #1149: Create custom schedule for your flows

Today’s tip is from Marius “flow like a river” Lind. (And you can also become a tipster by sending your tip to jar@crmtipoftheday.com)

A great man once asked,

can I set a custom schedule for when to run my flows?

The answer is yes, and like so many other solutions it involves nesting flows.

  1. First, create a flow with a timer trigger. I created one that triggers once per month.
  2. Next, create parallel branches inside this flow; in each branch, initialize a new variable.
  3. Add a “do until” loop that runs until the variable equals whatever value you need.
  4. Add one or more start flow actions that trigger the flow(s) in question.
  5. Add a delay action that waits for the time period you need.
  6. Finally, add an action that increments the variable (the resulting loop logic is sketched below).
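
If it helps to picture what each branch is doing, here is a minimal C# sketch of the equivalent loop. The real scheduler is built from Flow actions, not code, and the names below (RunBranch, StartChildFlow, targetRuns, delayBetweenRuns) are made up purely for illustration:

using System;
using System.Threading;

public static class FlowSchedulerSketch
{
  // Hypothetical stand-in for the "start flow" action
  static void StartChildFlow() => Console.WriteLine("child flow started");

  public static void RunBranch(int targetRuns, TimeSpan delayBetweenRuns)
  {
    int counter = 0;                    // step 2: initialize a variable
    do
    {
      StartChildFlow();                 // step 4: start the flow(s) in question
      Thread.Sleep(delayBetweenRuns);   // step 5: delay for the period you need
      counter++;                        // step 6: increment the variable
    }
    while (counter < targetRuns);       // step 3: "do until" the variable reaches the target
  }
}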

(Screenshot: the timed flow)
Now you’ve got a flow scheduler which can start your flows in whatever fashion you need. Happy pillaging!

Tip #1148: Use visual controls + calculated fields to create an in-form dashboard

Want to add some sizzle to your form configuration? Phil Dudovicz recommends using visual controls with calculated fields to create a nice-looking dashboard in your Dynamics 365 Unified Interface form. The following is an example of an investor form that visually displays relevant investor data.

Let’s take a closer look at how this is built:

  1. Current Year return

  • Calculated Field
  • Data Type: Decimal Number
  • Field Type: Calculated
  • ((new_investmentvalue - new_investmentvaluejan1) / new_investmentvalue) * 100
  • Control: Arc Knob
  • Value: hsl_currentyearreturn
  • Min: .1
  • Max: 15
  • Step: .1

  2. Return vs. Goal

  • Data Type: Decimal Number
  • Field Type: Calculated
  • (new_investmentvalue / new_investmentgoal) * 100
  • Control: Arc Knob
  • Value: hsl_goalprogress
  • Min: 0
  • Max: 100
  • Step: 1

By combining visual controls with calculated fields, you can easily add visual context to a record and give your users a richer experience.

Thanks for the tip, Phil. Do you have a tip for us? Send it to jar@crmtipoftheday.com.

(Facebook and Twitter cover photo by Paula May on Unsplash)

Tip #1147: Revisiting Queues and Teams

Almost two years ago I wrote on the merits of using Teams vs Queues for managing Cases. While I stand behind what I wrote (Teams are simpler but Queues are more powerful), another element raised its head recently which is worth considering if you are going down the path of setting up Case management.

Teams are a great, simple way of managing Cases. You set the ownership and set up some dashboards to view the Cases and you are away. However, unlike Queues, Teams have another important purpose in Dynamics: security. A common security pattern in Dynamics is to have separate Business Units, each with a ‘sharing Team’ so that Users in one Business Unit can hide their records from the other Business Units or allow specific individuals to see them via the sharing Team. I described this model in Tip 1067. So what happens when these two ideas collide?

For the most part, using Teams for both security and Case management still works fine. However, if you have a list showing “All Cases Owned By My Teams”, make sure the Cases listed are the ones you expect. Similarly, if you have an “All Cases Owned By Users in My Teams” list, make sure these are the Users you are expecting. In short, because you are using Teams as both a grouping for Cases and a grouping for security, if the two groupings are not aligned there may be confusion and unexpected results.

Tip #1146: Exporting product list items

A reader asks:

I’m sitting here trying to export price list items. I was thinking about getting the data out like the other tables in Excel and then using the import tool to import them into a cloud environment. But I’m not allowed to change the views either. Advanced Find doesn’t show the Price List Items table either.

To export price list items:

  1. Go to a price list and navigate to the price list items view. In 2013+, this is a subgrid, so you need to click the grid button on the subgrid to expand the view.
  2. Change the view to “Extended Product Price List – Products.” This view includes what you need to import, including the price list name.
  3. Export to excel using the standard export to Excel button.

If you are starting to build your price lists from scratch and you want an easy way to import them, one of the options for data import templates is price list items. You can go to Settings > Data Management > Templates For Data Import and select the Price List Items template to create an Excel template that you can populate with your price list items for import.

 

Tip #1145: Tracing in Azure Functions MkIII

This never ends. Shortly after I finished writing about tracing in Azure Functions, I found Daryl “Always Raising” LaBar explaining how to use the ExtendedOrganizationService wrapper to easily capture everything in your plugin.

The primary purpose of the tracing wrapper in Azure Functions is to reuse existing functionality you might already have implemented in separate assemblies. If the existing code uses Trace, then my wrapper works nicely, redirecting the output to the ILogger supplied by the function. But Daryl’s point is that the existing code, if it comes from plugins or workflows, would have used the ITracingService provided by the execution context. So we need our logger to understand ITracingService as well. Fear not, let me raise the LaBar by standing on his shoulders. We’ll take an extra parameter and implement the interface:

using Microsoft.Extensions.Logging;
using Microsoft.Xrm.Sdk;
using System.Diagnostics;

namespace YourNamespace
{
  // Log writer for working with Azure Functions 
  public class TraceWriterListener : 
               TraceListener, ITracingService
  {
    private ILogger _log;
    private ITracingService _tracer;

    public TraceWriterListener(string name, 
          ILogger logger,
          ITracingService tracer = null) : base(name)
    {
      _log = logger;
      _tracer = tracer;
    }

    public TraceWriterListener(ILogger logger,
          ITracingService tracer = null) : base()
    {
      _log = logger;
      _tracer = tracer;
    }

    // this is the one we have to overwrite to get 
    // the logging level right
    public override void TraceEvent(
          TraceEventCache eventCache, 
          string source, TraceEventType eventType, 
          int id, string message)
    {
      switch(eventType)
      {
        case TraceEventType.Verbose: 
          _log?.LogDebug(message); 
          break;
        case TraceEventType.Information: 
          _log?.LogInformation(message);
          break;
        case TraceEventType.Warning: 
          _log?.LogWarning(message); 
          break;
        case TraceEventType.Error: 
          _log?.LogError(message); 
          break;
        case TraceEventType.Critical: 
          _log?.LogCritical(message); 
          break;
        default:break;
      }
    }

    public override void Write(string message)
    {
      _log?.LogTrace(message);
      _tracer?.Trace(message);
    }

    public override void WriteLine(string message)
    {
      _log?.LogTrace(message);
      _tracer?.Trace(message);
    }

    // ITracingService implementation
    public void Trace(string fmt, params object[] args)
    {
      _log?.LogTrace(fmt, args);
      _tracer?.Trace(fmt, args);
    }
  }
}

Now you can call your existing AwesomeMethod and pass the wrapper in. That way the output will be captured by the Functions logging and will also go into the trace output of the Dynamics infrastructure.

public static void Run(
  [TimerTrigger("0 0 0 1 1 *")]TimerInfo myTimer, 
  ILogger log)
{
   // get your tracing service here 
   var tracingService = ...;
   var logger = new TraceWriterListener(
      log, tracingService);
    
   Trace.Listeners.Add(logger);
   Trace.TraceInformation($"Information");
   Trace.TraceWarning($"Warning");
   Trace.TraceError($"Error");
   Trace.WriteLine($"Level comes from ILogger");

   YourSuperDuperExistingClass
     .AwesomeMethod(whatever, logger);
}

(Facebook and Twitter cover photo by Caterina Beleffi on Unsplash)

Tip #1144: How to add business days

Over six months ago I got a tip from Sergio “Behind The Wall” Macías, part of the driving force behind the Spanish-speaking Dynamics 365 community, on how to create a custom workflow activity that adds business days to a specific date in Dynamics 365. In essence, the calculation is something like this:

while(businessDaysToAdd > 0)
{
  startDate = startDate.AddDays(1);
  if(IsBusinessDay(startDate))
   businessDaysToAdd--;
}

The heart of the calculation is, of course, the IsBusinessDay method, which skips the obvious Saturdays and Sundays and then goes through a holiday calendar to work out whether the employees are eating turkey instead of helping customers, or whether it’s business as usual (e.g. helping customers eat turkey). (“Go through a holiday calendar” sounds much easier than it actually is, trust me.)

I filed the tip with the intention of coming back to it when the opportunity presented itself, i.e. when I had a chance to test-drive it using a customer’s money, which happened just recently. Thanks to the vast Dynamics 365 community resources, regardless of the challenge, I rarely feel the need to either create custom code from scratch or repurpose existing code. On this occasion, I found two independent open source workflow libraries that would solve my challenge.
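
For reference, here is a minimal sketch of what an IsBusinessDay helper might look like, assuming holidays are supplied as a plain list of dates (the toolkits below read them from a Dynamics holiday calendar or a custom entity instead, so treat this as illustration only):

using System;
using System.Collections.Generic;
using System.Linq;

public static class BusinessDayHelper
{
  // Skip Saturdays, Sundays, and any date found in the holiday list
  public static bool IsBusinessDay(DateTime date, IEnumerable<DateTime> holidays)
  {
    if (date.DayOfWeek == DayOfWeek.Saturday || date.DayOfWeek == DayOfWeek.Sunday)
      return false;
    return !holidays.Any(h => h.Date == date.Date);
  }

  // The loop from above, with the helper plugged in
  public static DateTime AddBusinessDays(DateTime startDate, int businessDaysToAdd,
                                         IEnumerable<DateTime> holidays)
  {
    while (businessDaysToAdd > 0)
    {
      startDate = startDate.AddDays(1);
      if (IsBusinessDay(startDate, holidays))
        businessDaysToAdd--;
    }
    return startDate;
  }
}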

Andrii’s Ultimate Workflow Toolkit and Jason’s CRM DateTime Workflow Utilities both have Add Business Days methods, though the approach and the parameters are slightly different. Both take an existing date and the number of days to add as inputs, and return the resulting date as an output, but that’s where the similarities end.

CRM DateTime Workflow Utilities

The AddBusinessDays activity takes a Holiday/Closure Calendar parameter that you can point to a holiday calendar specific to your organization / department / business unit. This is the same calendar that can be used in service scheduling to define the availability of resources. The only downside of this implementation is that Saturdays and Sundays are hard-coded as non-business days. That works in 99% of cases, but on the odd occasion a business needs to include some weekend days.

Ultimate Workflow Toolkit

The AddBusinessDays activity in this toolkit does not use a holiday calendar but instead takes two extra parameters: weekend days (as a string of numbers from 0 to 6 (Sun to Sat) delimited by ‘|’) and a FetchXML expression that returns the days to skip (holidays). This “developer-friendly” approach works well if your organization does not use calendars but instead keeps holidays in a custom entity.

Waiting

If you are adding business days only to wait until a certain number of days passes, e.g. to remind a customer of an unpaid invoice after 5 business days, then you might want to consider a codeless approach using SLAs.

Thanks to Andrii “Granny’s Moonshine” Butenko and Jason “I can make a kettle talk to CRM” Lattimer for their awesome contributions – I use their toolkits on a daily basis.

(Facebook and Twitter cover photo by Curtis MacNewton on Unsplash)

Tip #1143: Managing Dynamics Goes Beyond Technology

It is easy to focus on training and certifications and think this will be enough to implement or maintain a successful Dynamics system, but, especially with process management systems like Dynamics, being tech savvy is not enough.

Certainly, knowing the system, its limitations, and its capabilities is important, but understanding the impact of change is vital for success. The traditional paradigm is People, Process, and Technology, with a change in one having an impact on the others.

To successfully administer or implement Dynamics, here are some of the skills and talents you will need access to:

People

  • Change communication: The ability to articulate what is coming, why it is beneficial, and to get buy-in
  • Capable trainers/technical writers: The ability to empower the users to embrace the change
  • Effective feedback mechanisms: Telling is one thing but listening is vital

Process

  • Process discovery: Find out how things are done and how they will be impacted through change
  • Process modelling: Documenting process, an often overlooked task, provides the opportunity to reflect and review where the biggest improvements can be made
  • Process performance management: Establishing measures for what defines a successful process provides proof of the benefits Dynamics is bringing

Technology

  • Functional knowledge: No point reinventing the wheel when it comes standard. Knowing what Dynamics provides means knowing how to use it to the best advantage
  • Technical knowledge: Where Dynamics ends, development begins. Knowing how to extend Dynamics takes a good system and makes it great
  • Roadmap knowledge: Knowing the future of Dynamics means developing the system to work with Microsoft and their vision for the product, not against it. Significant effort can be saved with a little foresight.

Tip #1142: Tracing in Azure Functions MkII

When describing tracing in Azure Functions previously, I mentioned almost in passing that capturing .NET traces in Azure Functions is easy – just create your own TraceListener. I also added that

the code takes a shortcut with log.Info and requires a bit of tuning like mapping logging levels from Connector to TraceWriter but those are the details we can live without

On one of the existing projects, which extensively uses Trace, I had an opportunity to eat my own words. Turns out we cannot live without those pesky details, because the logging level would be totally useless in that implementation.

Getting levels under control turned out to be a small challenge because, technically speaking, a listener is just that and shouldn’t concern itself with the logging level – you do what you’re told, basically. After some table-bending up-and-down head movements, I managed to come up with a TraceWriterListener class that takes the Azure Functions log interface as a parameter and correctly intercepts all levels of tracing, emitting the appropriate statements. Here is the class in all its glory:

using Microsoft.Extensions.Logging;
using System.Diagnostics;

namespace YourNamespace
{
  // Log writer for working with Azure Functions 
  public class TraceWriterListener : TraceListener
  {
    private ILogger _log;

    public TraceWriterListener(string name, 
          ILogger logger) : base(name)
    {
      _log = logger;
    }

    public TraceWriterListener(ILogger logger) : base()
    {
      _log = logger;
    }

    // this is the one we have to overwrite to get 
    // the logging level right
    public override void TraceEvent(
          TraceEventCache eventCache, 
          string source, TraceEventType eventType, 
          int id, string message)
    {
      switch(eventType)
      {
        case TraceEventType.Verbose: 
          _log?.LogDebug(message); 
          break;
        case TraceEventType.Information: 
          _log?.LogInformation(message);
          break;
        case TraceEventType.Warning: 
          _log?.LogWarning(message); 
          break;
        case TraceEventType.Error: 
          _log?.LogError(message); 
          break;
        case TraceEventType.Critical: 
          _log?.LogCritical(message); 
          break;
        default:break;
      }
    }

    public override void Write(string message)
    {
      _log?.LogTrace(message);
    }

    public override void WriteLine(string message)
    {
      _log?.LogTrace(message);
    }
  }
}

If you add this class to your function project (or drop it in as a .csx file into your function environment if you are not using projects), then the function code can use something like:

public static void Run(
  [TimerTrigger("0 0 0 1 1 *")]TimerInfo myTimer, 
  ILogger log)
{
   var logger = new TraceWriterListener(log);
   Trace.Listeners.Add(logger);
   Trace.TraceInformation($"Information");
   Trace.TraceWarning($"Warning");
   Trace.TraceError($"Error");
   Trace.WriteLine($"Level comes from ILogger");
}

which makes migration of the existing code fairly easy.

Bonus

You probably noticed the use of the ILogger interface where previously TraceWriter was sitting. As of recently (quite some time ago now, actually), functions have support for logging through the Microsoft.Extensions.Logging.ILogger interface. As per the documentation: at a high level, this is not much different from logging via the TraceWriter: logs continue to go to the file system and will also go to Application Insights (currently in Preview) if the APPINSIGHTS_INSTRUMENTATIONKEY app setting is set. The main advantage of using ILogger is that you get support for structured logging via Application Insights, which allows for richer Analytics support. To use ILogger as your logging interface, simply add a parameter to your function signature and use any of the logger extensions.
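
As a quick illustration of the structured logging bit, here is a minimal sketch (the {RecordCount} placeholder and its value are made up for illustration; the point is that named placeholders become queryable properties in Application Insights):

public static void Run(
  [TimerTrigger("0 0 0 1 1 *")]TimerInfo myTimer, 
  ILogger log)
{
   var recordCount = 42; // illustrative value only
   // named placeholders end up as custom dimensions in Application Insights
   log.LogInformation("Processed {RecordCount} records", recordCount);
}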

Tip #1140: Get ready for October 2018 release

No, Microsoft didn’t invent the time machine (though you can buy one in store). As part of modernizing the way Dynamics 365 gets updated, the team has just released the October 2018 release notes.

Why did the notes get released 3 (and for some features, 5) months ahead of schedule? So that customers and partners can start planning for the new and exciting capabilities coming to Dynamics 365. This is also a great opportunity to provide early feedback (by emailing releasenotes@microsoft.com) that will be reviewed and incorporated where applicable and sensible.

Overall, the document is a mixed bag. That’s understandable, because the contributions came from independent teams. Still, it’s hard to place features like “Improved grid with copy and paste” in Business Central on the same level as the Channel Integration Framework.

What caught my attention (YMMV, as I have a developer/platform/ISV bias):

  • Sales: Integration with Microsoft Teams and AI capabilities
  • Service: Service scheduling using URS (Unified Resource Scheduling), Omni-channel Engagement Hub, Channel Integration Framework, and Bring Your Own Bot capabilities
  • Portals: SharePoint integration, embedded Power BI charts, and (finally!) configuration migration schema for portals
  • URS: General enhancements, in-form scheduling, and self-service scheduling APIs
  • PowerApps: embedded canvas apps on entity forms, custom size and responsive layout for canvas apps, ALM and admin enhancements, native support for CDS data types
  • Flow: design Flows in Visio o__O, Flow inception (Flow management connector)
  • Data Integration: new and improved development and consumption for connectors

Commitment to online couldn’t be made clearer with the word “premises” mentioned only in relation to Russian localization of Finance and Operations, Dynamics 365 for Retail, Business Central, and On-premises Data gateway.

Download the 239 pages (including 14 pages of TOC) of goodness, get a drink of your choice, and spend a few minutes (or hours) reading it – it will most certainly be worth your time.

(Facebook and Twitter cover photo by Bud Helisson on Unsplash)