ian.blair@softstuff

My technical musings

Getting Audit data out of CRM

Audit Log information for a single attribute can be retrieved from Dynamics 365 using a RetrieveAttributeChangeHistoryRequest:

RetrieveAttributeChangeHistoryRequest attributeChangeHistoryRequest =
                            new RetrieveAttributeChangeHistoryRequest {
                                Target = new EntityReference(Contact.EntityLogicalName, contact.Id),
                                AttributeLogicalName = "new_myfield",
                            };

RetrieveAttributeChangeHistoryResponse attributeChangeHistoryResponse =
                            (RetrieveAttributeChangeHistoryResponse) context.Execute(attributeChangeHistoryRequest);

 

The returned data in the AuditDetails collection will contain an action attribute; this determines the type of operation that took place.

       foreach (var EachEditRecord in attributeChangeHistoryResponse.AuditDetailCollection.AuditDetails)
       {
             AuditDetail attributeDetail = (AuditDetail) EachEditRecord;
             OptionSetValue action = (OptionSetValue) attributeDetail.AuditRecord["action"];
        }

 

The possible actions are defined by the Audit entity's action option set; the common ones are 1 (Create), 2 (Update), and 3 (Delete).

The date of the change and the userid of the user who made it can be found with:

EntityReference userid = (EntityReference) attributeDetail.AuditRecord["userid"];
                            
DateTime? createdon = (DateTime?) attributeDetail.AuditRecord["createdon"];

 

Some of these actions will contain more information than others, and the resulting AuditDetail will need to be cast to an AttributeAuditDetail record to find out the old and new values of the field.

var recType = EachEditRecord.GetType();
if (recType == typeof(AttributeAuditDetail))
{
       if (action.Value == 2) // update
       {
              AttributeAuditDetail ar = (AttributeAuditDetail) attributeDetail;
              bool newval = ar.NewValue.Contains("new_myfield")
                             ? (bool) ar.NewValue["new_myfield"]
                             : false;

              bool oldval = ar.OldValue.Contains("new_myfield")
                             ? (bool) ar.OldValue["new_myfield"]
                             : false;
       }
}

 The OldValue and NewValue properties are both entities that contain the before and after values, and can be read in the normal way.

 

Updating your CRM V9 connection code

I had a client ring me in a bit of a panic: their CRM system had been upgraded to V9 over the weekend and the forms integration I had built for them ages ago had stopped working. They kept getting a 'Connection forcibly closed' error every time they tried to submit a form.
Fortunately I had recoded the connection part when they went from CRM2016 to Dynamics 365, so it looked like this.
 
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Tooling.Connector;
using System.ServiceModel;
var connectionString = "url=https://mycrmsystem; Username=myusername; Password=mypassword; authtype=Office365";
CrmServiceClient conn = new CrmServiceClient(connectionString);
using (OrganizationServiceProxy orgService = conn.OrganizationServiceProxy) {
     if (conn.IsReady) {
          // code to do stuff goes here
      } else {
          throw new InvalidOperationException(conn.LastCRMError);
      }
}

Making it work with CRM V9 turned out to be quite simple.
Firstly, I had to install .NET 4.6 on the server running the code; previously it was on 4.5.2. Next I had to install the new versions of Microsoft.Xrm.Sdk.dll and Microsoft.Xrm.Tooling.Connector from NuGet, as there no longer seems to be a single huge SDK download.
 
NuGet packages to install
Microsoft.CrmSdk.CoreAssemblies     v9.x
Microsoft.CrmSdk.XrmTooling.CoreAssembly   v9.x
 
Then the code needs two new lines for it to work.
Add a using System.Net; directive, then, as the new connections need TLS 1.2, force it with the line
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
anywhere before you actually make the connection. So the code should now look like this:
 
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Tooling.Connector;
using System.ServiceModel;
using System.Net;

ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
var connectionString = "url=https://mycrmsystem; Username=myusername; Password=mypassword; authtype=Office365";
CrmServiceClient conn = new CrmServiceClient(connectionString);
using (OrganizationServiceProxy orgService = conn.OrganizationServiceProxy) {
     if (conn.IsReady) {
          // code to do stuff goes here
      } else {
          throw new InvalidOperationException(conn.LastCRMError);
      }
}
 
And you are done. 
Easy.

Convert a managed solution to an unmanaged solution

Consider this scenario, although I will stress it is strictly hypothetical, as it would be completely stupid never to take any backups, and everybody has a good backup strategy in place, especially for portable devices. While you are configuring the CRM system, you have a consultant with a CRM virtual machine on his laptop; this is the development environment where all the initial changes happen. Once everyone is happy with the changes, they are moved across to your production system as a managed solution set. For a while this works very well. However, at some point his laptop dies or gets stolen; either way, you quickly realise there are no backups of the laptop and, more importantly, that the solution on your production system is managed, so it won't be easy to set up a new development system.

Fortunately there is a way around this problem. You won't be able to convert the live system to unmanaged, but you will be able to create a new organisation and import an unmanaged version of the solution into it, effectively recreating the original development environment, hopefully this time with a better backup strategy.

To do this you will need the managed solution file that was used to import the solution into your live environment. It will have a .zip extension and be called something like MySolution_Managed_1_0_0_0.zip; hopefully it is checked into your version control system, or at least sitting in a folder somewhere on a server or desktop.

Once you have the file unzip it into a temporary location.

Open the solution.xml file in either an XML editor or Notepad.

Within the file find the node

<Managed>1</Managed>

and change it to

<Managed>0</Managed>

Save the file, and re-zip the solution.

Now create your new environment/organisation and import the solution file normally.
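If you find yourself doing this regularly, the unzip/edit/re-zip steps above can be scripted. Below is a minimal sketch using System.IO.Compression; the class and method names are my own invention, and it assumes solution.xml sits at the root of the archive, which is where a solution export puts it.

```csharp
using System;
using System.IO;
using System.IO.Compression;

class SolutionConverter
{
    // Flip the <Managed> flag inside a solution zip from 1 (managed) to 0 (unmanaged).
    // Work on a copy of the file, not your only original!
    public static void ConvertToUnmanaged(string zipPath)
    {
        using (ZipArchive archive = ZipFile.Open(zipPath, ZipArchiveMode.Update))
        {
            ZipArchiveEntry entry = archive.GetEntry("solution.xml");
            if (entry == null)
                throw new FileNotFoundException("solution.xml not found in " + zipPath);

            string xml;
            using (StreamReader reader = new StreamReader(entry.Open()))
                xml = reader.ReadToEnd();

            xml = xml.Replace("<Managed>1</Managed>", "<Managed>0</Managed>");

            // replace the entry with the edited content
            entry.Delete();
            ZipArchiveEntry newEntry = archive.CreateEntry("solution.xml");
            using (StreamWriter writer = new StreamWriter(newEntry.Open()))
                writer.Write(xml);
        }
    }
}
```

Point it at a copy of the managed zip, then import the modified file as normal.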

Of course, when you transfer the solution in future it makes life easier if you export two versions from your development environment, one managed and one unmanaged, and keep copies!

 

Of course it goes without saying that if you don't own the copyright to a solution file then don't convert it; this applies to various ISV solutions, add-ons, etc.

Create an identical Queue in another CRM system

Sometimes, when you have multiple environments, creating workflows or dialogs in your DEV system can prove to be an issue, especially if queues are involved. Queues aren't included in solutions, and if you create a queue with the same name in another system it will have a different Guid, so the workflow that references it will break when it is moved across to production.

It's pretty straightforward to create a queue with the same Guid in another system, and it only requires a few lines of code.

First get the Guid of your queue.

Find the queue in the Settings -> Business Management -> Queues screen and then press the pop-out button.

In the location bar of the new browser window you will see the string that contains the Guid for the queue.

https://crm2016/testorg/main.aspx?etc=2020&extraqs=&histKey=885738149&id=%7b3D9CB3AB-C26B-E711-80FE-005056877901%7d&newWindow=true&pagetype=entityrecord#78522619

You will need the id parameter (the Guid between the encoded braces %7b and %7d), although the Guid will of course be different on your system.
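If you are scripting this, the Guid can also be pulled out of the URL programmatically. The helper below is purely illustrative (the class name is my own); it finds the id query parameter and decodes the %7b/%7d braces.

```csharp
using System;

class QueueUrlParser
{
    // Extract the record Guid from a CRM pop-out URL's id parameter.
    public static Guid ExtractId(string url)
    {
        foreach (string part in url.Split('&', '?'))
        {
            if (part.StartsWith("id=", StringComparison.OrdinalIgnoreCase))
            {
                // %7b..%7d decodes to {..}; strip the braces before parsing
                string raw = Uri.UnescapeDataString(part.Substring(3));
                return Guid.Parse(raw.Trim('{', '}'));
            }
        }
        throw new ArgumentException("No id parameter found in URL");
    }
}
```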

Then, using Visual Studio and the latest version of the Dynamics 365 API, you will need code like this.

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Tooling.Connector;
using System.ServiceModel;

var connectionString = "url=https://mycrmsystem; Username=myusername; Password=mypassword; authtype=Office365";
CrmServiceClient conn = new CrmServiceClient(connectionString);
using (OrganizationServiceProxy orgService = conn.OrganizationServiceProxy) {
     if (conn.IsReady) {
       
               Entity newq1 = new Entity("queue");
               newq1["queueid"] = new Guid("<the guid from above>");
               newq1["name"] = "<the name of the queue>";
               orgService.Create(newq1);
      }
}

 

This can be wrapped in your favourite type of program, a console app or Windows Forms based program, and when run it will create a queue that you can reference in workflows and dialogs without them breaking when they move across into production.

 

Old vs New Connection Methods in CRM2016/Dynamics 365

With the release of the new version of the CRM2016/Dynamics 365 SDK, the recommended method of connecting to CRM in code has changed.

Originally it was (although other methods were available with connection strings)

using Microsoft.Xrm.Sdk;
using System.ServiceModel.Description;

var url = "http://mycrmsystem/XRMServices/2011/Organization.svc";
var username = "myusername";
var password = "mypassword";
var domain = "";
var organizationUri = new Uri(url);            
var credentials = new ClientCredentials();
credentials.UserName.UserName = domain + username;
credentials.UserName.Password = password;
credentials.Windows.ClientCredential.UserName = username;
credentials.Windows.ClientCredential.Password = password;
credentials.Windows.ClientCredential.Domain = domain;

using (OrganizationServiceProxy _service = new OrganizationServiceProxy(organizationUri, null, credentials, null)) {
// code to do stuff goes here
}

With the advent of the tooling connector DLL and the other changes in the API, this should now be changed to:

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Tooling.Connector;
using System.ServiceModel;

var connectionString = "url=https://mycrmsystem; Username=myusername; Password=mypassword; authtype=Office365";
CrmServiceClient conn = new CrmServiceClient(connectionString);
using (OrganizationServiceProxy orgService = conn.OrganizationServiceProxy) {
     if (conn.IsReady) {
          // code to do stuff goes here
      } else {
          throw new InvalidOperationException(conn.LastCRMError);
      }
}

The key now is the connection string, as this determines the connection method via the authtype= parameter.

The example above assumes you are connecting to an Office365 hosted CRM system, but if you were connecting to an on-premise Active Directory system the connection string might be

var connectionString = "url=https://mycrmsystem/myorg; Username=myusername; Password=mypassword; Domain=mydomain; authtype=AD";

 

The other feature is the .IsReady property: if this is set to true the connection has been successful and can be used for further processing, otherwise the .LastCRMError and .LastCRMException properties can be checked to see what went wrong.

 

 

Recovering the licence key in a Dynamics 365 on-premise system.

Needing to verify that the correct licence key had been used for a Dynamics 365 on-premise upgrade I realised that the Deployment Manager will allow you to change the key that’s being used, but it won’t allow you to see the current key.

Being on-premise made life slightly easier, as all I had to do was break out SQL Server Management Studio and run the following query.

select NVarCharColumn from MSCRM_CONFIG.dbo.ConfigSettingsProperties where ColumnName='LicenseKeyV8RTM'

Of course you will need suitable rights to the MSCRM_CONFIG database to be able to run this query.

Preventing infinite loops in CRM2013/2015/2016 plugins

Imagine the scenario where you have a plugin that executes when an entity is updated. We will call this entity A, and the plugin updates entity B. Not usually a problem, but to make things more interesting we have a plugin on entity B that fires an update back to entity A. This tries to execute the first plugin again, which updates B, which updates A again, and so on until the plugins fail with an infinite loop.

Good system design can often get around this, and 99% of the time you won't have to worry about it, but for the remaining 1% the IPluginExecutionContext.Depth property comes in very useful.

This property shows the number of times the plugin has been called during the current execution; if it exceeds 8 (the WorkflowSettings.MaxDepth setting can be changed), the execution fails because the system considers that an infinite loop has occurred.

So in the first example, entity A is updated and the plugin executes (Depth=1); B is updated, the other plugin fires, and A is updated again. Our plugin fires again (Depth=2), B is updated, the other plugin fires and updates A. Our plugin fires again (Depth=3), and so on.

public class SampleOnCreate : IPlugin
{
	public void Execute(IServiceProvider serviceProvider)
	{
		Microsoft.Xrm.Sdk.IPluginExecutionContext context = (Microsoft.Xrm.Sdk.IPluginExecutionContext)serviceProvider.GetService(typeof(Microsoft.Xrm.Sdk.IPluginExecutionContext));
		IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
		IOrganizationService _service = serviceFactory.CreateOrganizationService(context.UserId);
		if (context.Depth > 1) { return; } // only fire once
		// do stuff here
	}
}

 

For most instances, exiting the plugin when context.Depth > 1 will stop it running more than once from the main calling entity, and if you want it to run when entity A is updated, which then calls entity B, then checking for context.Depth > 2 will work, although of course the actual code will depend on your requirements.

Have you lost your security icon in CRM 2016?

After recently upgrading a CRM2013 test system in one of my VMs, I discovered that I couldn't change any security settings, as the option in Settings had completely disappeared. I thought for a while I was going mad, but no, it should definitely have been there.

It turns out the fix was really easy to do:

First create a new solution, and add the sitemap to it.

Export it.

Open the zip file and pull out the customizations.xml file and open it in your favourite editor.

Search the file for Group Id="System_Setting". Once fixed, the section should look like this:

<SubArea Id="nav_administration" ResourceId="Homepage_Administration" DescriptionResourceId="Administration_SubArea_Description" Icon="/_imgs/ico_18_administration.gif" Url="/tools/Admin/admin.aspx" AvailableOffline="false" />
<SubArea Id="nav_security" ResourceId="AdminSecurity_SubArea_Title" DescriptionResourceId="AdminSecurity_SubArea_Description" Icon="/_imgs/area/Security_32.png" Url="/tools/AdminSecurity/adminsecurity_area.aspx" AvailableOffline="false" />
<SubArea Id="nav_datamanagement" ResourceId="Homepage_DataManagement" DescriptionResourceId="DataManagement_SubArea_Description" Icon="/_imgs/ico_18_datamanagement.gif" Url="/tools/DataManagement/datamanagement.aspx" AvailableOffline="false" />

 

Between the nav_administration and nav_datamanagement items, insert the nav_security line (the middle line in the block above).

Save the file.

Insert the file back into the zip file.

Reimport the solution and publish it.

It might be best to do this out of hours if it's a production system, as I had to perform an IISRESET before the icon came back for me.

Multithreading in C# to speed up CRM2015 bulk tasks

I had a problem with a piece of software I wrote quite a long time ago. When it was first implemented and data volumes were low it worked fine, but recently the volume of data it is expected to process has grown beyond all expectations, so it was time to revisit the code to see if I could speed things up.

Basically, all the software does is send a travel alert email out to people on a mailing list twice a day, at a time they specify in 30 minute blocks. The old software worked in a pretty sequential way: first get a list of all the people expecting the email during that time slot, then work through the list, generating each email from a template, moving it to the correct sending queue, sending it, and then moving on to the next one. Unfortunately, the volumes of traffic now meant it was taking longer than 30 minutes to get all the emails out in some slots, so people in the next slots were getting theirs much later than they wanted, and the alerts were no longer useful.

My first idea was to split the process in two: one process to generate the emails and a second to send them. But my first cut didn't really show much of a speed increase; instead of an email roughly every 4 seconds it was now an email every 3 seconds, so while it might speed things up a bit it wasn't great.

Next I thought about multithreading, and fortunately in .Net 4 and above this is really easy. When I implemented it on a test VM it went from 20 emails a minute, to 20 emails in 4 seconds which to me is a big enough speed increase to make it worthwhile.

Here is my final code:-

using (_serviceProxy = new OrganizationServiceProxy(crmURI,null, clientCredentials,null)) 
{ 
    _serviceProxy.EnableProxyTypes();
    _service = (IOrganizationService)_serviceProxy;
// do some stuff here like get a list of records to process
    int maxThread = 10; // decide on how many threads I want to process
    SemaphoreSlim sem = new SemaphoreSlim(maxThread); // initialise a semaphore
    foreach (_toprocess emailtosend in ReadWriteCRM.RecordsToProcess)
    {
        sem.Wait(); // if all my threads are in use wait until one becomes available
// then start a new thread and call the make message function
        Task.Factory.StartNew(() => MakeMessage(emailtosend, _service)).ContinueWith(t => sem.Release());
// spawn a new copy of the MakeMessage function and pass 2 parameters to it
// then release the semaphore when the function completes
    }
// this is the important bit, because of the using statement if this is omitted then each task will fail
// because the IOrganizationService will no longer be available
    while (sem.CurrentCount<maxThread)
    {
// .CurrentCount is the number of available tasks so once all the
// tasks are closed it should equal the number you set in maxThread earlier
        Thread.Sleep(2); // let the rest of the system catch up
    }

}

 
private void MakeMessage(_toprocess emailtosend, IOrganizationService _service)
{
    // do stuff here 
    ReadWriteCRM.CreateNewEmailFromTemplate(_service, emailtosend);
}

 

Using the SemaphoreSlim class makes the whole process painless, as it easily allows you to decide in advance how many simultaneous tasks you want to run. In the final code I added this value to the configuration file so I can tweak it until I am happy with the balance during final testing.

int maxThread = 10; // decide on how many threads I want to process

SemaphoreSlim sem = new SemaphoreSlim(maxThread); // initialise a semaphore

Next, inside the actual processing loop, I added a Wait call; this pauses the loop until a free task slot becomes available.

sem.Wait();

Then once a slot is available I use the line Task.Factory.StartNew to create a new copy of the function that performs all the work.

Task.Factory.StartNew(() => MakeMessage(emailtosend, _service)).ContinueWith(t => sem.Release());

This starts the function, passes 2 parameters to it, and then when the function is done it releases the semaphore so the thread can be used again by another copy.

Initially, once I had this up and running in testing, it threw errors telling me the IOrganizationService had been closed.

Cannot access a disposed object.

Object name: 'System.ServiceModel.ChannelFactory`1[Microsoft.Xrm.Sdk.IOrganizationService]'.

This took a little head scratching, as sometimes it would run with no errors and other times it would fail. Eventually I realised that because I was creating the IOrganizationService in a using statement, the threads would often still be running silently in the background when the using{} block ended and disposed of the IOrganizationService, especially with the final block of threads. I could have removed the using{} statement altogether and relied on the C# clean-up to get rid of it, but instead I added the following at the end:

while (sem.CurrentCount < maxThread)
{
 Thread.Sleep(2);
}

This waits until all the threads are closed before continuing. sem.CurrentCount shows the number of threads available out of the original number you set in maxThread, so if you set a pool size of 10 you just have to wait until sem.CurrentCount == 10 again before letting the using{} scope close.
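To see the throttling pattern in isolation, here is a stand-alone sketch with the CRM work replaced by a trivial calculation; the names are my own invention, and Task.WaitAll on the collected tasks is shown as an alternative to polling sem.CurrentCount.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

class ThrottleDemo
{
    // Process every item with at most maxThread concurrent workers,
    // returning how many items were processed.
    public static int ProcessAll(int[] items, int maxThread)
    {
        ConcurrentBag<int> processed = new ConcurrentBag<int>();
        SemaphoreSlim sem = new SemaphoreSlim(maxThread);
        List<Task> tasks = new List<Task>();

        foreach (int item in items)
        {
            sem.Wait(); // block until a worker slot is free
            int current = item;
            tasks.Add(Task.Factory.StartNew(() => processed.Add(current * 2))
                                  .ContinueWith(t => sem.Release()));
        }

        // wait for everything to finish, instead of polling sem.CurrentCount
        Task.WaitAll(tasks.ToArray());
        return processed.Count;
    }
}
```

Collecting the tasks and calling Task.WaitAll avoids the Thread.Sleep polling loop entirely, which is worth considering if you are keeping the task handles anyway.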

So far in testing this has provided a huge speed increase with very little effort.

Getting OptionSet values out of CRM2015 with C#

Every so often, when you are reading a record in code, the integer values from the OptionSet fields aren't enough, because you want to display the whole of the data contained in the record and make it fit for user consumption.

Unfortunately the only way is to pull the OptionSet values out of the metadata separately and then do the comparison in code to find the correct text value, but this also makes it quite straightforward to make your own interfaces multilingual or to populate your own lists for record creation.

A function like this will pull out a List of KeyValuePair containing the numeric value of the OptionSetValue and the text value local to the user who is running the code. It is quite easy to extend this to only pull out the text values for a specific user locale.

using System.Collections.Generic;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

public static List<KeyValuePair<int, string>> getOptionSetText(IOrganizationService service, string entityName, string attributeName)
{
    List<KeyValuePair<int, string>> values = new List<KeyValuePair<int, string>>();
    RetrieveAttributeRequest retrieveAttributeRequest = new RetrieveAttributeRequest();
    retrieveAttributeRequest.EntityLogicalName = entityName;
    retrieveAttributeRequest.LogicalName = attributeName;
    retrieveAttributeRequest.RetrieveAsIfPublished = false;
    RetrieveAttributeResponse retrieveAttributeResponse = (RetrieveAttributeResponse)service.Execute(retrieveAttributeRequest);
    PicklistAttributeMetadata picklistAttributeMetadata = (PicklistAttributeMetadata)retrieveAttributeResponse.AttributeMetadata;
    OptionSetMetadata optionsetMetadata = picklistAttributeMetadata.OptionSet;

    foreach (OptionMetadata optionMetadata in optionsetMetadata.Options)
    {
        if (optionMetadata.Value.HasValue)
        {
            values.Add(new KeyValuePair<int, string>(optionMetadata.Value.Value, optionMetadata.Label.UserLocalizedLabel.Label));
        }
    }
    return values;
}

To retrieve localised OptionSetValue text, instead of using optionMetadata.Label.UserLocalizedLabel.Label you could substitute something like

foreach (LocalizedLabel l in optionMetadata.Label.LocalizedLabels)
{
	string labelText = l.Label;
	int lang = l.LanguageCode;
}

 

This will loop around all the localised values to extract the text and the locale id code.
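Once the list is built, turning an OptionSetValue integer back into display text is a simple lookup. A short illustrative helper (the class name and the "(unknown)" fallback are my own choices):

```csharp
using System.Collections.Generic;
using System.Linq;

class OptionSetLookup
{
    // Find the label for a given option value in the list returned by
    // getOptionSetText; fall back to a placeholder if the value is unknown.
    public static string LabelFor(List<KeyValuePair<int, string>> values, int optionValue)
    {
        return values.FirstOrDefault(kv => kv.Key == optionValue).Value ?? "(unknown)";
    }
}
```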