Some people may have heard of an industry best practice that says you should never add custom columns (fields) to the systemuser table (entity) in Dataverse. Is this true, and why? This article is based on my understanding of the inner workings of Dataverse, and hence of what you need to think about when designing your application so that you don't unintentionally destroy your environment's performance. In short: be careful about adding custom columns to systemuser, and if you do, only add columns that hold static data, i.e. data that doesn't change often. Let me describe this in more detail.
First of all, I would like to give credit for a lot of this to my friend and former Business Applications MVP Adam Vero, who described it to me in detail. I have also discussed this with other people and have since had it confirmed, but I have not actually seen it documented as such, which may be why it isn't fully official. I do, however, not see any problem with people understanding this; rather the opposite.
Dataverse is an application platform with security built in as an integral part: security roles, system users, teams and business units form the core pieces of the security model. As the system often needs to query data from these four tables, it has a built-in "caching" functionality that, per environment, loads these four tables and precalculates them into an in-memory table for easy and fast access. This is then kept in memory for as long as the data in these four tables stays static; in other words, as long as nothing is changed: no updates, no creates, no deletes.
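To make the mechanism concrete, here is a toy model of that behaviour. This is my own sketch with made-up names, not actual Dataverse code: a per-environment cache that is lazily precalculated and flushed by any write to one of the four security tables.

```python
# Illustrative sketch only: NOT Dataverse source code, just a toy model
# of the caching behaviour described above, with hypothetical names.

class SecurityCache:
    """Per-environment in-memory precalculation of the four security tables."""

    SECURITY_TABLES = {"systemuser", "team", "businessunit", "role"}

    def __init__(self, fetch_tables):
        self._fetch_tables = fetch_tables  # callable that reads the four tables
        self._cache = None

    def get(self):
        # Recalculate only when the cache has been flushed.
        if self._cache is None:
            self._cache = self._fetch_tables()  # the expensive precalculation
        return self._cache

    def on_write(self, table):
        # Any create/update/delete on one of the four tables flushes the cache;
        # writes to any other table leave it untouched.
        if table in self.SECURITY_TABLES:
            self._cache = None
```

The point of the sketch: writes to, say, the account table are harmless, but every write to systemuser forces the next security check to redo the expensive precalculation.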
What could then happen if you add a column to the systemuser table? Well, that depends. If it is a column that you set when the user is created and then never change, that isn't a problem, as it wouldn't affect the precalculated in-memory table. However, consider a column whose data changes constantly. Say you add a column called "activities last 24h" and create a Flow that increments it every time an email, appointment etc. is created, and resets it every night for every user. Then every one of those writes, to any user, will flush the precalculated in-memory table, which must be recalculated before it can be used again, causing a severe performance hit that can be very hard to troubleshoot.
How would you build a solution for the "activities last 24h" scenario described above, then? Well, I would probably create a related entity called userstatistics with a relationship to systemuser. It could even be smarter to make this a 1:N relationship, as you could then have many userstatistics rows per user and measure differences in activity day by day.
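As a sketch of that pattern, assuming a hypothetical custom table named new_userstatistics with a lookup column new_UserId to systemuser (all names, column logical names and the API version here are my assumptions, not anything from the product), the daily statistics row could be created through the Dataverse Web API roughly like this. The function only builds the request, so the idea is visible without any live environment:

```python
# A minimal sketch, assuming a custom table "new_userstatistics" with a
# lookup "new_UserId" to systemuser; table, column and environment names
# are hypothetical.

import json

def build_userstatistics_create(base_url, user_id, activity_count, day):
    """Build a Dataverse Web API create request that writes activity
    statistics to a related row instead of touching systemuser itself."""
    url = f"{base_url}/api/data/v9.2/new_userstatisticses"
    payload = {
        # Bind the new row to the user via the lookup, rather than
        # updating a column on systemuser (which would flush the cache).
        "new_UserId@odata.bind": f"/systemusers({user_id})",
        "new_activitycount": activity_count,
        "new_day": day,
    }
    return url, json.dumps(payload)
```

Since all the writes land on the related table, the four security tables stay untouched and the precalculated in-memory table survives.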
But wouldn't the NLBs (Network Load Balancers) make this irrelevant, as each environment is hosted together with many others? Well, due to NDA I cannot talk about the details of how the NLBs actually work for the online environments, but I can say this: no, it is still relevant. The NLBs are there for performance reasons, but they do not change this behavior.
As for teams, only owner teams count in this equation; access teams are only used for sharing and other types of grouping, and are hence never part of this precalculation.
And the savvy reader will of course realize that the multiplied size of:
systemusers x owner teams x business units x security roles
makes up the size of this precalculated table, and for large implementations this gives an indication of where performance can start to suffer, as every time a user does not have organization-level privileges, the system has to go through the entire table to check what is allowed. And then, of course, there is the POA. But that table is a story for another day and another article.
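As a back-of-the-envelope illustration of how quickly that multiplication grows, here is the formula with made-up numbers for a large implementation (the figures are purely hypothetical):

```python
# Rough sizing of the precalculated table from the formula above,
# using made-up numbers for a large implementation.

def security_cache_rows(users, owner_teams, business_units, roles):
    """Upper bound on the combinations the platform may need to evaluate."""
    return users * owner_teams * business_units * roles

rows = security_cache_rows(users=5000, owner_teams=200, business_units=50, roles=100)
print(f"{rows:,}")  # 5,000,000,000 combinations in the worst case
```

Numbers that look modest on their own multiply into a table that is very expensive to recalculate, which is exactly why frequent flushes hurt so much.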
Just a final word. The platform is constantly shifting, and even though this was true, and probably still is, there might be changes going on, or that have already happened, that I am unaware of and that change how this works. If I hear of any, I will let you know.
Server Side Sync is a technology for connecting Dynamics 365 CE to an Exchange server. When connecting an online Dynamics 365 to an on-premise Exchange there are some requirements that need to be met. These can be found here: https://technet.microsoft.com/sv-se/library/mt622059.aspx
However, I just had a meeting with Microsoft, and based on the version shown 2018-09-05, they have now added some new features that they haven't yet had time to get into the documentation.
One of the most interesting parts of the integration is that it requires Basic Authentication for EWS (Exchange Web Services). Of the three available authentication types, Kerberos, NTLM and Basic, Basic Authentication is, as the name might hint, the least secure. Hence it is not very well liked by many Exchange admins, and may be a blocker for enabling Server Side Sync in Dynamics 365.
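To see why Basic Authentication is considered the weakest of the three: the credentials are only base64-encoded in the Authorization header, not encrypted, so anyone who can read the header can recover them. A small illustration with made-up credentials:

```python
# Why Basic Authentication is weak: the header is just base64, which is
# an encoding, not encryption. Credentials below are made up.

import base64

def basic_auth_header(username, password):
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Authorization: Basic {token}"

header = basic_auth_header("svc-crm", "S3cret!")

# Decoding it back is trivial for anyone who sees the header:
decoded = base64.b64decode(header.split()[-1]).decode()
print(decoded)  # svc-crm:S3cret!
```

This is why Basic Authentication is only acceptable over TLS, and why NTLM or Kerberos, which never send the password itself, are preferred by Exchange admins.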
In the meeting I just had with Microsoft, they mentioned that they now support NTLM as well! That is great news as that will enable more organizations to enable Server Side Sync.
There is still a requirement to use a user with Application Impersonation rights, which might be an issue, as that can be viewed as granting too high rights within the Exchange server. For this there is currently no good alternative. I guess making sure that the Dynamics admins are trustworthy, and knowing that the password is encrypted in Dynamics, might ease some of that concern. But if the impersonation user is compromised, then an attacker with the right tools or dev skills could compromise the entire Exchange server.
Microsoft also mentioned another common issue that can arise with the Outlook App when using Server Side Sync and a hybrid connection to an on-premise Exchange 2013. The app will show a quick alert saying "Can't connect to Exchange", but it will still be able to load all the Dynamics parts.
According to Microsoft, this might be caused by the fact that Exchange 2013 doesn't automatically create a self-signed certificate that it can use for this communication. Hence this has to be done manually.
This can be fixed by first creating a self-signed certificate, then modifying the authorization configuration using the instructions found here. Lastly, publish the certificate. It can also be a good idea to check that the certificate is still valid and hasn't expired.
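For that last check, here is a small sketch of my own (not Exchange-specific) that tests whether a certificate's notAfter timestamp is still in the future, using the date format Python's ssl module returns from getpeercert():

```python
# A local sketch for checking certificate expiry, given a "notAfter" value
# in the format used by Python's ssl.SSLSocket.getpeercert().

import ssl
import time

def cert_still_valid(not_after, now=None):
    """Return True if the certificate's notAfter timestamp is in the future."""
    # cert_time_to_seconds parses e.g. "Jun  1 12:00:00 2030 GMT" to epoch seconds.
    expires = ssl.cert_time_to_seconds(not_after)
    return expires > (now if now is not None else time.time())

print(cert_still_valid("Jun  1 12:00:00 2030 GMT"))  # True until June 2030
```

Against a live server you would obtain the notAfter value from the certificate itself; the parsing and comparison logic stays the same.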
I will see if I can create a more detailed instruction on this later.
MVP, Founder and Principal Consultant at CRM-konsulterna AB
As I have previously discussed (on Software Advice, in this article), the US laws giving US legal authorities the right to order US-owned companies to hand over data are very strong. Hence, if you have sensitive data that might be of interest to any government agency in the US, you should think once or twice about storing it in a data center owned by a US company (like Salesforce.com, Amazon, Microsoft, Google etc.).
A recent case in New York has shown that this is not only theory but very much practice. The case concerned an email account, probably on outlook.com; however, the difference compared to Salesforce.com or CRM Online is purely academic.
On the positive side, Microsoft is fighting back, trying to protect its customer, something I am very happy about. I do hope they do this for all customers.
The other directly positive side of this is, of course, that Microsoft CRM can be acquired from sources other than companies based in the USA: for instance a standard CRM on-premise installation, or a partner-hosted installation, of which there are many service providers, like Midpoint and my own company CRM-Konsulterna in Sweden. Salesforce.com, on the other hand, does not have this option, so if you have sensitive data, be careful; it might be ordered into the wrong hands.
MVP, CEO and owner at CRM-konsulterna AB
If you have been reading my blog, you might have noticed that I have written several times (More on the insecurity of data, External posting on if your cloud system is safe from the law) about the fact that I think many companies view the cloud a bit too lightly, especially from the legal perspective: many countries' governments perceive themselves as having rights to your data if it resides in that country, travels through that country, or is owned by a company in that country.
From this perspective, I must conclude that the Snowden case is not very surprising. It merely confirms what I expected to be true all along, and I would say that anybody who is surprised is rather showing exceptional naivety towards our governments.
Hence I would just remind you again: if you have sensitive data, beware of all the threats to that data. There are hackers, lawyers, FBI agents and XYZ agents, and you have to assess the interest of all potential parties in your data. Then understand and handle the risk in a controlled manner.