Anonymizing data in UAT/Dev – GDPR

On the eve of GDPR, what could be more fitting than a post on GDPR? I think everyone is probably dead tired of all the consent emails by now, and I suspect they have even reached our friends in the US and Asia.

This article touches on legal matters relating to GDPR. It is based on my personal interpretations and is not to be viewed as legal advice.

One interesting thing that has to be considered in relation to GDPR is how to handle personal information in non-production environments/instances. Microsoft has made it painfully easy to copy the production instance with the instance manager, but do you really have the right to use your customers’ personal information in a development, UAT or staging environment? Do you have legal support for that? I think that would be a very hard argument to make. Have you got your customers’ explicit consent to use their personal data for that purpose? Probably not. Hence, if you are planning on keeping the instance/environment for more than 30 days, you will need to remove all personal information from it to stay within the boundaries of GDPR. I have found that using SSIS with Kingswaysoft and the Anonymization component in the Productivity toolkit is very useful, and I will in this rather lengthy article describe how I have used it to set up an anonymization script.

First of all, you need to download SSDT and the Kingswaysoft Dynamics 365 and Productivity packs.

Then start a new Business Intelligence -> Integration Services project.

Then start by right clicking in the empty field at the bottom where it says “Connection managers” and choosing “New connection…”

In the dialog that shows up choose “DynamicsCRM”

You should now see a dialog showing the connection settings to Dynamics.

Connection settings for your instance

Choose the right settings for your instance. Test your connection at the bottom when you are done to make sure it works. Make extra sure you are not connecting to your production environment – you wouldn’t want to anonymize that!

When this is done, it is time to make your first Data Flow Task. Work in SSIS is divided into two parts, Control Flow and Data Flow. The Control Flow is the orchestration, which tells SSIS in which order everything is to be run. If you want things to run in parallel, just put two boxes next to each other; if you want one Data Flow to run before the other, drag the arrow from the first to the second. It is also possible to have entire “Sequence Containers” which can hold several components and make sure they execute before moving to the next stage.

Let’s start by dragging a new Data Flow from the Toolbox on the left hand side to the Control Flow work pane. Then double click it. This will open it up.

You will now see that the tab at the top has changed to “Data Flow” from the previous “Control Flow”. In the “Data Flow” view you will also have a different set of toolbox components available.

In the “Data Flow” you will control a single data flow. For instance the anonymization of Contact.

Start by dragging the Dynamics CRM Source from the toolbox (on the left) to the workspace (the big pane in the middle), then double click it. Before looking at the details of the source component: I like using FetchXML when building queries, and of course the best way to build FetchXML queries is using FetchXML Builder in XrmToolBox (thanks Jonas Rapp and Tanguy Touzard for all your work!). But if that is too much heavy lifting (it really isn’t), the easiest way to get a FetchXML query is to build an Advanced Find query and export it with the “Download FetchXML” button in the top right hand side of the ribbon of the Advanced Find query builder. So let’s say we have decided that the following fields in Contact are personal information and need to be anonymized:

The column editor in Advanced Find – don’t use composite fields like “fullname” or “Address1”

Downloading the FetchXML and setting it in the source component in SSDT (Visual Studio) will make it look like this:
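The screenshot is not reproduced here, but the downloaded query will look something along these lines. Note that the attribute list below is just an illustration of fields you might classify as personal data – use whatever you selected in Advanced Find:

```xml
<fetch>
  <entity name="contact">
    <attribute name="contactid" />
    <attribute name="firstname" />
    <attribute name="lastname" />
    <attribute name="emailaddress1" />
    <attribute name="telephone1" />
    <attribute name="address1_line1" />
    <attribute name="address1_city" />
    <attribute name="statecode" />
    <attribute name="statuscode" />
  </entity>
</fetch>
```

I have included statecode and statuscode here since they come in handy for the handling of deactivated records described further down.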

I have set the “Connection Manager” to “Target” as we are using the same source and target (reading from and writing to the same system).

I am leaving the batch size at 2000; it seems to work well. Don’t reduce it too much – remember the API limit of 60 000 calls per 5 minute period.

If you would like to try it out first, you can set “Max rows returned” to, for instance, “10” and run it to see that it isn’t going crazy.

As source type I prefer FetchXML – remember that a FetchXML query can return data from several entities, which can make queries a lot easier than trying to match the data with lookups in SSIS.
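For example, a hypothetical query like this one pulls the parent account’s name in the same read, saving you a lookup step in SSIS:

```xml
<fetch>
  <entity name="contact">
    <attribute name="contactid" />
    <attribute name="firstname" />
    <link-entity name="account" from="accountid" to="parentcustomerid" link-type="outer" alias="acct">
      <attribute name="name" />
    </link-entity>
  </entity>
</fetch>
```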

I always try to read all data in UTC and write it in UTC, which in most cases makes it correct. But make sure you understand how time zones work if you need to fiddle with this.

Also, don’t include more columns than you need; it will just make your script slow. After adding or changing the FetchXML, the component will try to parse it and read the metadata from Dyn365/CRM, so there might be a slight delay. You can check what data this component will output by clicking on “Columns” on the left hand side.

When done, press “OK” to go back to the “Data Flow” pane.

Now add a “Data Anonymization” component and drag the arrow from the source component to the anonymizer. Then double click the anonymizer.

You should see something like this, where I have set anonymization settings for the different columns. By default it will say “Ignore” on all columns.
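To get a feel for what an anonymization type does conceptually, here is a rough Python sketch of the kind of masking such a component might perform – this is purely an illustration, not Kingswaysoft’s actual implementation:

```python
import random
import string

def mask_text(value):
    """Replace each letter/digit with a random one, keeping length and shape."""
    if value is None:
        return None
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(random.choice(string.digits))
        elif ch.isalpha():
            out.append(random.choice(string.ascii_lowercase))
        else:
            out.append(ch)  # keep separators like @, . and - so formats survive
    return "".join(out)

# mask_text("john.doe@example.com") might return "qwzr.klm@vbnstuv.xyz"
```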

Try out the different anonymization types; some are more generic than others. When done, click OK and go back to the data flow pane. Add a Dynamics CRM Destination and drag the blue arrow from the anonymizer (blue arrows are the normal data output, red arrows are error output) to the Dynamics CRM Destination component. Then double click it. The view in the data flow should look something like the picture below.

Dyn365 source -> anonymizer -> writing to Dyn365
Dyn365 destination component – set values where the arrows are

When setting up the destination component, there are a few things to consider:

  • In this case we are always doing updates – hence set the action to update. It is faster than upsert.
  • You have to set the Destination Entity.
  • If you write data to Dyn365 with high latency, no batching and no threads, you will be able to update about 2-3 records per second. With low latency, correct batch settings and multithreading, I have been able to get up to 300 records per second, though this is very dependent on the entity. Hover the mouse over the blue “i” just after “Enable Multithread Writing” for some deep end tips from the scholars at Kingswaysoft.
  • Error handling is recommended to be directed to a file or some other output where you can monitor it. If you do nothing about it and you get an error, it will break the flow and stop. You can control the error handling of the destination component by clicking on the “Error Handling” tab on the left hand side. Remember that you will get all types of exceptions thrown by Dyn365 here as well, like missing rights, disabled records etc.

A common problem that needs to be handled is that records are deactivated, and deactivated records cannot be changed – they have to be reopened first, then changed, then re-closed. This is an example of a data flow handling this:

Data flow which splits off the deactivated records to the left, adds two special columns to reactivate them, writes an update with only the statecode & status reason to the record, and then merges the two data streams and writes the original values, which re-closes the ones that were closed from the beginning

This is how the conditional split is defined – if statecode is 0, send the record to the output called “Open”, otherwise send it to the output called “Closed”.
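If you want to see the logic spelled out outside of SSIS, here is a minimal Python sketch of the same reopen-update-reclose pattern against the Web API. The org URL and token are placeholders, and it assumes statecode/statuscode can be set with a plain PATCH:

```python
import requests

BASE = "https://yourorg.api.crm.dynamics.com/api/data/v9.0"  # placeholder org URL
HEADERS = {
    "Authorization": "Bearer <token>",  # placeholder token
    "Content-Type": "application/json",
}

def anonymize_contact(contact_id, anonymized_fields, statecode, statuscode):
    url = f"{BASE}/contacts({contact_id})"
    if statecode != 0:
        # Deactivated records cannot be updated - reactivate first
        requests.patch(url, json={"statecode": 0, "statuscode": 1},
                       headers=HEADERS).raise_for_status()
    # Write the anonymized values
    requests.patch(url, json=anonymized_fields, headers=HEADERS).raise_for_status()
    if statecode != 0:
        # Restore the original state so the record is re-closed
        requests.patch(url,
                       json={"statecode": statecode, "statuscode": statuscode},
                       headers=HEADERS).raise_for_status()
```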

Then you can test run your data flow by right clicking the data flow pane and pressing “Execute Task”.

And when you have assembled a larger control flow – like for instance this:

A control flow with several data flows, some disabled – all in sequence.

you can execute the entire flow by clicking the green plus sign in the ribbon.

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Admin portal for managing your tickets

In case you haven’t seen it, the Dynamics 365 Admin Portal is a great place to create and manage your Dynamics 365 tickets.

You get to it using this URL: https://admin.dynamics.com/

I haven’t found a link to the admin portal from the instance manager or any other place in O365 or Dyn365 yet, so I have just created a bookmark for it and suggest you do the same.

Also, as I am a frequent Microsoft Support user and often have customers with many instances, I have tried to consolidate several instances into one ticket. They don’t want that; it is better to create one ticket for each instance (probably gets someone a higher salary too :)). However, I did notice that there seems to be some duplicate detection going on, so try to avoid using the same “Issue Summary” for several tickets – that will just make the later ones disappear without any error message (hint Microsoft, please fix this! – at least give a decent error message).

In order to create tickets, you need to be either an O365 Global Admin or a Dynamics 365 Service Administrator. There might be some other admin role that has the right to create tickets as well, but I don’t think so.

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

New API Limit

Photo by Vidar Nordli-Mathisen on Unsplash

Related to my last post on working with the API quickly, Microsoft has now released official documentation stating that, effective March 19th, they will start limiting the number of API calls allowed per instance, to stop what are called “noisy neighbour” problems.

First of all, read the full article here: https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/api-limits 

Let’s break this down a bit: 60 000 calls per 5 minutes translates to about 200 calls per second. If you exceed this, you will start getting exceptions until the 5 minute period has ended. You are expected to back off and essentially handle this yourself. That is the short version; read the full article for more details.
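In code, “backing off” boils down to a retry loop. A minimal sketch, assuming the blocked call comes back as HTTP 429 with a Retry-After header telling you how long to wait:

```python
import time
import requests

def call_with_backoff(session, method, url, max_retries=5, **kwargs):
    """Retry a Dynamics 365 Web API call when the 60 000/5 min limit is hit."""
    for _ in range(max_retries):
        response = session.request(method, url, **kwargs)
        if response.status_code != 429:  # not throttled - return as-is
            return response
        # Back off for as long as the server asks, or 30 s as a fallback
        time.sleep(int(response.headers.get("Retry-After", 30)))
    raise RuntimeError("API limit still exceeded after retries")
```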

Update: George Doubinski, a friend of mine and one of the brains of CRM Tip of the Day made me aware of the fact that the limit is per user. I will update the article below on what this means.

What does this mean? Is this a problem?

For most organizations – at least the ones I work with – no, they are not even close to breaking this. If they are using integration tools like Kingswaysoft or other tools that enable multithreaded integrations, but generally do not need that kind of data throughput, they might temporarily be shut down, but it should self-heal after some time, as each new 5 minute time span gives you another 60 000 requests. That can probably quite easily be fixed by checking the settings of the integration tool. Update: Also, if you integrate each system using a separate account, you do not risk one system temporarily blocking many other systems from integrating with Dynamics 365. If you are using normal users this will of course entail a certain license cost, which is why I generally recommend using app users for integrations if possible – and given this change, you should have one app user for each integrating system.

However, there are some organizations where I foresee issues, and these are organizations which have any combination of the following criteria:

  1. Third party products, like Marketing Automation tools (ClickDimensions, FreshRelevance, SalesForce Marketing Cloud), which have not had time or do not yet have this in their scope, and which integrate large amounts of data into Dynamics 365. Update: Especially if the service user they integrate with is a normal user, either also used by a person or shared with integrations to other systems.
  2. Legacy code that has been upgraded to the new SDK but uses an inefficient architecture – for example, code that has issues using ExecuteMultiple, which the article above describes as the recommended best practice, typically because the architecture of the code would require major rewriting to allow for it. Update: In this case I strongly recommend using a dedicated user for this specific integration, to isolate any limitations to that user.
  3. Organizations with multiple heavy integrations to Dynamics 365. It will be hard to ensure that the sum total does not exceed 60k per 5 minutes, and to handle back-off in a controlled way. The only reasonable way would probably be to rewrite the integrations to go through a proxy or a queue, like Azure Service Bus queues, and have a single integration interface. Probably a lot easier to write in a blog article than to do in real life. Update: This was an incorrect deduction on my part; as the limit is not based on the sum total but on the sum per user, this is not a risk unless many integrations use the same user, which I do not recommend.
  4. Organizations with complex, heavy integrations with thousands of lines of integration code that need to be redesigned, rewritten, tested and deployed before March 19th. And there is no way to test it, as there is no TAP/Beta program for this “feature”. Update: This is still very relevant. Even such a small change as changing the integrating user for an external system should be thoroughly tested, and for larger implementations that can be hard to do before March 19.

Example

A typical example is a B2C organization running Dynamics 365 with a marketing automation add-on doing email tracking and web tracking. They also have a very time-critical integration of orders, to be able to handle incidents. Even if the order integration in itself does not reach the limits, it is not unforeseeable that a mail blast – especially a good mail blast, where many customers read the emails, click the links, go to the site, check the offers and start ordering – would cause a surge of traffic on the Marketing Automation–Dynamics 365 integration. This of course depends on its settings, but perhaps it is critical that all events be tracked to Dynamics. With a mail blast to, let’s say, 1 million recipients, the 60 k/5 min limit would be hit quickly. When this happens, it would also block all orders from reaching Dynamics, causing an effective stop for working with any new incidents in the system.

Update: This is, of course, only relevant if both systems integrate using the same user. Don’t. However, the marketing automation system above would hit the limit fast anyway, and if the supplier of that system has not had time to update their product/service, it will handle this incorrectly. I recommend checking your integrating systems and trying to turn down the verbosity of what they write to Dynamics 365. Then, after March 18, when we see how this falls out in detail, you can test a more verbose setting in a test environment and see how that goes.

Summary

For small and medium companies with low complexity, working mainly with B2B, I don’t see much of a problem. Larger companies with complex integrations, large databases and integrations to web tracking and email tracking – often B2C companies with higher levels of automation and larger customer databases – will probably have bigger problems with this and need to start thinking about it right now.

We need to come back to this subject after March 19 to see how it really works. But I think the real problem will be for the larger orgs with many heavy integrations.

I would be really glad to hear your views on this, like I got George’s.

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Deleting a lot of records fast

A quick one today…

I needed to delete a couple of million records for a customer, and the natural thing was to use the Bulk Deletion service. Well, I turned it on and it was extremely slow – I only got about 10 records/s, which would cause the entire delete to take over a week. I have checked with Microsoft, and this is not a bug; it is working as designed and is not designed to be super fast. According to Microsoft, bulk deletion jobs are put on the async queue with low priority, to give other, more important jobs higher priority.

And a favorite quote of mine, from Purvin Patel of Microsoft: “Does a dump truck need to outrace a Ferrari?” – I think the answer to that question is: it depends. Sometimes it does.

Personally I would sometimes like it to be as fast as possible when removing a lot of records.

I also checked how fast the deletion would be with SSIS and Kingswaysoft, using the following settings:

  • VM about 5 ms from the Dynamics 365 instance (important that it is not too far away; use an Azure VM for this)
  • Used 64 threads
  • Used ExecuteMultiple batching with a batch size of 10 (you cannot use more than 10 if you are using a lot of threads, i.e. more than 2)
  • VM has 8 virtual cores and 32 GB memory
  • Loading in batches of 2000. Only loading the id-column, as that is all that is needed.

With this setup, I got somewhere around 345 records deleted per second, which is a tad more than 34x faster than the bulk delete.
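For comparison, here is a rough Python sketch of the same idea – many threads deleting via the Web API – though without the ExecuteMultiple batching that Kingswaysoft does for you. The org URL and token are placeholders:

```python
from concurrent.futures import ThreadPoolExecutor
import requests

BASE = "https://yourorg.api.crm.dynamics.com/api/data/v9.0"  # placeholder org URL
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder token

def delete_record(entity_set, record_id):
    # One DELETE per record; ExecuteMultiple batching would cut the round trips
    r = requests.delete(f"{BASE}/{entity_set}({record_id})", headers=HEADERS)
    r.raise_for_status()

def delete_all(entity_set, record_ids, threads=64):
    # 64 threads, mirroring the SSIS setup above
    with ThreadPoolExecutor(max_workers=threads) as pool:
        list(pool.map(lambda rid: delete_record(entity_set, rid), record_ids))

# usage: delete_all("contacts", ids_loaded_in_batches_of_2000)
```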

So, if you want to delete a lot of records, maybe Bulk Delete is not the way to go. Not yet, anyway – let’s hope Microsoft makes it faster!

(this post was updated on Feb 9th, 2018)

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Supported browsers – update

One of my most popular posts is a collation of which browsers are actually supported for Dynamics 365 CE. This is not strange, since it is a common issue: we often have customers call our support complaining that Dynamics doesn’t work well, and as described in the original article, the cause is usually that they are using an unsupported browser.

Also, the Microsoft Technet article on this: https://docs.microsoft.com/en-us/dynamics365/customer-engagement/admin/supported-web-browsers-and-mobile-devices
and the detailed version: https://technet.microsoft.com/en-us/library/hh699710.aspx mainly focus on Edge and Internet Explorer compatibility for the different Windows versions. And despite Microsoft’s hopes, the rest of the world has not totally switched to their programs yet.

There are currently some issues with version 9.0 as well, which is why I thought it might be a good idea to write a new article on which browsers are supported, in an easily overviewable grid.

A brief explanation of the different abbreviations used in the picture:

  • SW – Supported and Working.
  • SW* – Supported and Working on OS X as designed, but with somewhat limited functionality. If you compare Dynamics on OS X running Safari with Edge on Windows 10, you will notice that the latter has more options.
  • SW** – The app is not the same thing as the browser at the moment. The Unified Interface will make them almost the same; there will still be differences, in that the app will have better integration with the device’s hardware, like the camera. Especially in the earlier versions the app was very limited, but it is more powerful now. For field service there is also a special Field Service app which is not HTML/JS-based (it is based on Resco).
  • SW*** – I have not had the chance to test whether version 9.0 works on a Nexus 10 running Chrome, but the official documentation says it should. However, as v9.0 does not work with Chrome on Windows at the moment (this will probably be fixed very soon), I would actually be surprised if it worked for Chrome on a Nexus 10.
  • NS – Not Supported. This does not mean, however, that you cannot get it to work. I have heard of people who manage to get Chrome on OS X to go into some “PC” mode which tricks Dynamics. This still doesn’t make it supported, but it might work for you. The problem is that if you run into any issues, Microsoft Support could leave you hanging.
  • SNW – Supported but Not Working. As for Chrome on Windows, the general story is that Microsoft only supports the latest version of it. However, despite version 9.0 officially supporting Chrome on Windows, it doesn’t seem to actually work – Dynamics 365 won’t even load. I would presume that they will fix this soon.

The support matrix for the browsers is generally a tricky question. The matrix above is also simplified a bit, as I have not included the parameter of the different versions of the operating systems. It can probably be presumed that Microsoft will make greater efforts to ensure that the latest versions of Edge and Internet Explorer – and possibly even some versions before that which they know customers are using – work well with Dynamics 365. I know some of you will probably think I am crazy for even mentioning Internet Explorer, but I have several customers who still have IE as the company standard browser, and it is not due to Dynamics 365.

As for Firefox, Chrome and Safari, these browsers might be very popular, but from a Microsoft perspective I can see a very tricky situation: they roll out changes without notifying anyone. This was, for instance, done by Google a couple of years ago when they changed their handling of modal dialogs. This caused many dialog handling features of Dynamics CRM (at the time) to break. It took some time for Microsoft to identify the change, fix it, make sure there were no regression errors, and roll it through all the testing scenarios, deployment stages and controls they do before they were able to get a fix out. It did come out quite fast, but there were some people nagging at Microsoft for having a bad product, which I personally find a bit unfair.

The essence of the two paragraphs above is: feel free to use Chrome, Safari or Firefox, but I would suggest always having Edge and/or IE as a backup option in case an update comes to your main browser that severely affects the usage of Dynamics 365.

And I hope that Microsoft soon fixes the error that stops us from using Chrome with ver. 9.0. Any day now!

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se