Hybrid NTLM Server Side Sync and Exchange 2013 Cert secrets

Server Side Sync is a technology for connecting Dynamics 365 CE to an Exchange server. When connecting an online Dynamics 365 to an onprem Exchange there are some requirements that need to be met. These can be found here: https://technet.microsoft.com/sv-se/library/mt622059.aspx

Piping data to and from Exchange and Dynamics
By Quartl [CC BY-SA 3.0], from Wikimedia Commons

However, I just had a meeting with Microsoft, and based on the version shown on 2018-09-05, they have now added some new features that they haven’t had time to get into the documentation yet.

One of the most notable parts of the integration is that it requires Basic Authentication for EWS (Exchange Web Services). Of the three types of authentication available, Kerberos, NTLM and Basic, Basic Authentication is, as the name might hint, the least secure. Hence it is also not very well liked by many Exchange admins and may be a blocker for enabling Server Side Sync in Dynamics 365.

In the meeting I just had with Microsoft, they mentioned that they now support NTLM as well! That is great news as that will enable more organizations to enable Server Side Sync.

There is still a requirement to use a user with Application Impersonation rights, which might be an issue as that can be viewed as granting too high rights within the Exchange server. For this there is currently no good alternative solution. I guess making sure that the Dynamics admins are trustworthy, and knowing that the password is encrypted in Dynamics, might ease some of that. But if the impersonation user is compromised, then a haxxor with the right tools or dev skills could compromise the entire Exchange server.

Microsoft also mentioned another common issue that can arise with the Outlook App when using Server Side Sync and a hybrid connection to an Exchange 2013 onprem: it will show a quick alert saying “Can’t connect to Exchange” but will still be able to load all the Dynamics parts.

According to Microsoft, this might be caused by the fact that Exchange 2013 doesn’t automatically create a self-signed certificate that it can use for this communication. Hence this has to be done manually.

This can be fixed by first creating a self-signed certificate, then modifying the authorization configuration using the instructions found here, and lastly publishing the certificate. It is also a good idea to check that the certificate is still valid and hasn’t expired.
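
For reference, those steps correspond roughly to something like the following in the Exchange Management Shell. This is just a sketch: the subject name, domain name and thumbprint are placeholders, so check the official instructions before running anything.

# Create a new self-signed auth certificate (the domain name is a placeholder)
New-ExchangeCertificate -KeySize 2048 -PrivateKeyExportable $true -SubjectName "cn=Microsoft Exchange Server Auth Certificate" -FriendlyName "Microsoft Exchange Server Auth Certificate" -DomainName "yourdomain.com"

# Point the authorization configuration at the new certificate (thumbprint from the command above)
Set-AuthConfig -NewCertificateThumbprint <thumbprint> -NewCertificateEffectiveDate (Get-Date)

# Publish the certificate so it is available for the hybrid scenario
Set-AuthConfig -PublishCertificate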

I will see if I can create a more detailed instruction on this later.

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Setting up Data Export Service without PowerShell Script

Setting up Dynamics 365 Data Export Service requires an Azure Key Vault, which is typically set up using a PowerShell script that can be found in the Data Export Service setup wizard. However, if you run into issues with the script, it might be easier to do the steps manually, directly in the Azure portal. This was a tip from my friend and Business Solutions MVP Scott Durow. He mentions this in his very instructive video, but doesn’t actually show how, so I thought I’d just detail how I made it work.

First some background. The reason why I even started investigating how to do this manually was that I ran into problems when trying to run the PowerShell script supplied by Microsoft in the wizard.

Press the “i” icon to get a window containing the PowerShell Script that Microsoft recommends for setting up the Key Vault.

When running the PowerShell script, both as myself (not a global admin) and by asking a global admin to do it, it failed in the latter parts. The key vault was created, but some of the access policies seemed to be missing and it just didn’t work. My user’s rights in Azure were Contributor on the Resource Group, and interestingly the global admin and I got different error messages. However, when I finally managed to create the key vault manually, I could do it all with my own user, so it didn’t seem like I was missing any rights.

The first step is to make sure you have all your data straight. The PowerShell script is good for this. Check out Scott’s clip if you want to know how to find the different strings; he shows it very clearly.

Just copied from the PS-Script:

$subscriptionId = '<subscription ID>'
$keyvaultName = 'MyVault'
$secretName = 'MySecretName'
$location = 'North Europe'
$connectionString = 'Server=tcp:<db-name>,1433;Initial Catalog=<catalog>;Persist Security Info=False;User ID={your_username};Password={your_password};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;'
$organizationIdList = '<DYN365GUID>'
$tenantId = '<AZURE TENANT ID>'

The placeholder parts and example values have to be replaced by your own settings. I will use these variables as references further on in this article.

Search for Key Vault and add the “Key vault”, the top one in this picture

Then we have to set it up. Not so tricky if you have worked with Azure before. Consider whether you want to work in an existing resource group or create a new one. Typically you need to have the Azure SQL services running as well, so it might be good to keep everything together in one resource group to be able to see the costs and control who has access, which means that resource group might already exist. If not, you can create it. I would recommend keeping Azure SQL and the Key vault in the same resource group. I am not sure if it actually works with different resource groups, it probably does, but I haven’t tested it.

Creating the key vault – in this case I am creating a new resource group, normally it would already exist

Azure will add you as the default principal with access to the key vault. We will add Data Export Service to this later. For now, just create it.
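
If you would rather script this part than click through the portal, the equivalent in the Az PowerShell module (or the older AzureRM cmdlets) would be roughly the following. The resource group name is just an example, and I am reusing the variables from the PS-Script above.

# Sign in and select the right subscription
Connect-AzAccount
Set-AzContext -Subscription $subscriptionId

# Create the resource group if it doesn't already exist (example name)
New-AzResourceGroup -Name 'DataExportRG' -Location $location

# Create the key vault; your own user is added as the default principal
New-AzKeyVault -Name $keyvaultName -ResourceGroupName 'DataExportRG' -Location $location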

Now we need to open the Key vault and select the “Secrets” section in the menu on the left hand side and press the button:

“+ Generate/Import” 

Then you have to enter your Secret name ($secretName) and put the connection string ($connectionString) into the value.

Creating a secret – $secretname in Name and $connectionstring in Value

Press “Create”.
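
The PowerShell equivalent would be along these lines, again using the variables from the PS-Script; the connection string has to be wrapped in a secure string first:

# Store the connection string as the value of the secret
$secureConnectionString = ConvertTo-SecureString $connectionString -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName $keyvaultName -Name $secretName -SecretValue $secureConnectionString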

You should now return to the previous screen and see a row for your secret.
Select it.

It should open the settings panel for the Secret. Press the “Tags” part, which is located in the middle, and add a tag which has the organization id list ($organizationIdList) as the key and the tenant id ($tenantId) as the value. I have blurred them out below as they are rather private.

Adding a tag with OrgIdList and tenantId to a Secret
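
If you are scripting it, the tag can be set directly when creating the secret, something like this (the hashtable key is the organization id list and the value is the tenant id):

# Tag the secret: key = organization id list, value = tenant id
Set-AzKeyVaultSecret -VaultName $keyvaultName -Name $secretName -SecretValue $secureConnectionString -Tag @{ $organizationIdList = $tenantId }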

You then need to go back to the Key Vault and click on the “Access policies” menu item. You should then see yourself as a principal, as this was set when we created the key vault. We now need to add Data Export Service as a valid principal with read access rights.

So click “Add”, click “Select Principal” and search for “b861dbcc-a7ef-4219-a005-0e4de4ea7dcf” which is the ID for Data Export Service. It should show up like this:

It needs to have “Secret Management Operations – GET” permissions and nothing else.
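
The same access policy can also be added with PowerShell; the GUID below is the Data Export Service application ID mentioned above:

# Give Data Export Service GET permission on secrets, and nothing else
Set-AzKeyVaultAccessPolicy -VaultName $keyvaultName -ServicePrincipalName 'b861dbcc-a7ef-4219-a005-0e4de4ea7dcf' -PermissionsToSecrets get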

Now, go back to the Secret and copy the URI to the Secret.

Getting the URI for the Key Vault Secret

Paste it into the Data Export Service Wizard field for Key Vault.
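
If you prefer to fetch the URI with PowerShell instead of copying it from the portal, the Id property of the secret should contain the full URI:

# The Id property is the full URI of the secret, including the version
(Get-AzKeyVaultSecret -VaultName $keyvaultName -Name $secretName).Id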

Fill in the other information and press validate. Hopefully it will work out well!

Some issues

Being too cheap with the Azure SQL level
If you don’t go for an Azure SQL P1 and choose a lower tier, you might get this warning:

We tried an S0 for our dev environment and tried to sync a couple of million records, and that just didn’t work; we got tons of errors. We upgraded the Azure SQL database to S2 and then at least we didn’t get any errors. We are planning for P1s in UAT and production.
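
Scaling the database tier can be done in the portal or with something like the line below; the server and database names are placeholders for your own:

# Scale the Azure SQL database to the S2 service objective (example names)
Set-AzSqlDatabase -ResourceGroupName 'DataExportRG' -ServerName '<your sql server>' -DatabaseName '<your database>' -RequestedServiceObjectiveName 'S2'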

Might have to set activation date on secret
It seems that you might have to set an activation date on the secret. I am not sure why this is, as the PS-script doesn’t seem to do this. But it is not very hard.

Added activation date on the Secret from June 4th
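
In PowerShell terms the activation date corresponds to the NotBefore attribute on the secret, which, if I am not mistaken, can be set afterwards like this:

# Set the activation date (NotBefore) of the secret to now
Update-AzKeyVaultSecret -VaultName $keyvaultName -Name $secretName -NotBefore (Get-Date)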

Using Database schema that is not created
The default database schema in the Data Export Service Wizard is “dbo”. If you change this to something else, like “crm”, and haven’t created that schema in the database, you will get an error. It is simple to fix: you just have to go into the database and create the schema. To create the schema “crm”, open a query and run:
CREATE SCHEMA crm

For more information on how to create schemas, check this site: https://docs.microsoft.com/en-us/sql/t-sql/statements/create-schema-transact-sql?view=sql-server-2017

Once the schema has been created, there should be no problem using it, as long as the user has permissions to use it.

I hope this works for you. If you have any questions, don’t hesitate to leave a comment.

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Deleting a lot of records fast

A quick one today…

I needed to delete a couple of million records for a customer and the natural thing was to use the Bulk Deletion service. Well, I turned it on and it was extremely slow; I only got about 10 records/s, which would cause the entire delete to take over a week. I have checked with Microsoft and this is not a bug, it is working as designed, and it is not designed to be super fast. According to Microsoft, bulk deletion jobs are put on the async queue with low priority to give other, more important jobs higher priority.

And a favorite quote of mine from Purvin Patel of Microsoft: “Does a dump truck need to outrace a Ferrari?” – and I think that the answer to that question is: it depends. Sometimes it does.

Personally I would sometimes like it to be as fast as possible when removing a lot of records.

I also checked to see how fast the deletion would be with SSIS and KingswaySoft. I used the following settings:

  • VM about 5 ms from the Dynamics 365 instance (important that it not be too far, use an Azure VM for this)
  • Used 64 threads
  • Used Execute Multiple batching with 10 (you cannot use more than 10 if you are using a lot of threads, i.e. more than 2)
  • VM has 8 virtual cores and 32 GB memory
  • Loading in batches of 2000. Only loading the id-column, as that is all that is needed.

With this setup, I got somewhere around 345 records deleted per second, which is a tad more than 34x faster than the bulk delete.

So, if you want to delete a lot of stuff, maybe Bulk Delete is not the way to go. Not yet anyway; let’s hope Microsoft makes it faster!

(this post was updated on Feb 9th 2018)

Gustaf Westerlund
MVP, Founder and Principal Consultant at CRM-konsulterna AB
www.crmkonsulterna.se

Snowden, Prism and if there is anything new under the sun

If you have been reading my blog previously, you might have noticed that I have written several times (More on the insecurity of data, External posting on if your cloud system is safe from the law) about the fact that I think many companies are viewing cloud computing a bit too lightly, especially from the legal perspective: many countries’ governments perceive themselves as having rights to your data if the data either resides in that country, travels through that country or is owned by a company in that country.

From this perspective, I must conclude that the Snowden case is not very surprising. It merely confirms what I expected to be true all along, and I would say that anybody who is surprised is rather showing exceptional naivety towards our governments.

Hence I would just again remind you that if you have sensitive data, beware of all the threats to that data. There are hackers, lawyers, FBI agents and XYZ-agents, and you have to assess the interest of all potential parties in your data. Then understand and handle the risk in a controlled manner.

Gustaf Westerlund
MVP, CEO and owner at CRM-konsulterna AB
www.crmkonsulterna.se

More on the insecurity of cloud data

As I previously discussed in a guest posting at Software Advice, the legal aspects of cloud computing are interesting, and a recent article in “Computer Sweden” (in Swedish) again raised this issue with references to the relevant law in the US.

The article references the Foreign Intelligence Surveillance Act Amendments Act, FISAAA, and describes how it can be used without any court order or case-by-case permits required. According to the interpretation of the law it has some limitations, but it can rather freely be used to gather non-American data. It does not have to be political data; it can simply be data from a foreign region affecting US foreign affairs.

As it does not require case-by-case permissions, the interpretation of the law is probably handled quite far down the ranks where this is deemed necessary. The interpretation could hence also be quite wide, and I would not be surprised if data such as defence industry business opportunities falls within this area, and probably other related areas as well. Foreign affairs is a wide definition; automobile export, telecommunication equipment and software export are definitely within the boundaries.

I would hence strongly advise against putting data in countries with legislation similar to this (the USA is definitely not the only country). When using the Microsoft Dynamics CRM Online service in Europe, the case is a bit better as the data is stored in countries within the EU’s jurisdiction. The US law is, however, interesting in this respect as it focuses on the companies being American and not on where the data is actually stored. Hence, the US government might be able to push Microsoft/Salesforce/Google or any other American cloud systems supplier into handing over data backed by this law, even if the data is stored in other countries.

To be on the safe side, from the legal aspect, storing the data on your own servers run by your own people is always the safest. Their loyalty lies with your company and you have control over the physical storage of the data. Do note that this, however, is no safeguard against hackers.

Gustaf Westerlund
MVP, CEO and owner at CRM-konsulterna AB
www.crmkonsulterna.se