Thursday, February 14, 2019

Disaster recovery for Dynamics 365 Customer Engagement

Disaster recovery is something which is often overlooked or underappreciated in online deployments of Dynamics 365 Customer Engagement. In this blog post we are going to take a look at how to build a point-in-time recovery plan using the Data Export Service (DES) solution along with the Get Data functionality found in Common Data Service (CDS). 

Why disaster recovery? 
Online services come with a lot of awesome functionality which helps mitigate data loss and prolonged periods of downtime. However, no system is ever perfect, and you should always plan for worst-case scenarios and how to survive them. There are many great articles on how to plan and implement disaster recovery plans, so in this article I’m going to focus on how to implement a disaster recovery plan using DES, temporal tables and CDS. 

The mission 
The mission is to implement a disaster recovery plan which enables us to: 
  • Choose a point-in-time restore option 
  • Pick and choose restore options per entity 
  • Have granular control over backed up entities and relationships 

Setting up an Azure SQL Database 
Go to https://portal.azure.com and add a new Azure SQL Database. If you don’t have an existing server, go ahead and create a new one at the same time. For the pricing tier, the DES documentation recommends an S3 tier for most deployments, but your mileage may vary. Once the database has been created, go to the overview section to find the connection strings. Copy the ADO.NET string and fill in the username and password specified for the server; you will need it in the next step. 
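For reference, the ADO.NET connection string from the portal looks roughly like this (the server, database, user and password below are placeholders; substitute your own values):

  Server=tcp:yourserver.database.windows.net,1433;Initial Catalog=yourdatabase;Persist Security Info=False;User ID=yourusername;Password=yourpassword;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;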

Setting up Data Export Service (DES) 
DES is a native solution for D365CE provided by Microsoft. Simply open the Dynamics 365 admin center (https://port.crm<area number>.dynamics.com/g/instances/instancepicker.aspx), click on the organization you want to add DES to, then click the edit solutions button. NB! If the solutions button isn’t showing, just click the organization another time. Locate the Data Export Service and click install. Make sure you read and understand the terms of service, then click install again to install the solution. 
 
You can come back and refresh this screen to see how the installation is progressing, but the duration varies a lot, so I recommend coming back to it in an hour or so, even though it usually doesn’t take anywhere near that long. After the solution is installed, open up your Dynamics 365 instance and navigate to Settings => Data Export. Here you will see a web resource which lists all DES profiles; it should be empty if you haven’t used it before. Click on the new button to initiate the profile wizard. 
Click on the blue information button to get a popup which includes a PowerShell script that will set up a key vault for you. Open the Start menu, run PowerShell ISE and paste the script contents into the editor. Make sure you edit the placeholders to contain the correct values, then hit F5 to run the script. The connection string from the previous section is stored as a secret in the key vault, and a service principal is set up which allows Dynamics to access the key vault and retrieve secrets. NB! Access policies in the key vault apply to everything in a category (for example, all secrets); there is no granularity within a category. Keep this in mind in case you want to reuse the vault. 
After the script has run it will output the key vault URL. Copy this and use it as the key vault URL in the DES profile setup. Feel free to enable the delete log, but we don’t need it for this specific guide. 
 
Next, select your entities and relationships. Which entities to pick depends on what you want to back up, obviously, and relationships will give you lookup tables which make it really easy to analyze the data (you probably want this). Check out the summary before you complete the wizard, and voila! You’ve got your first DES profile up and running. 

Configuring temporal tables 
Now for the awesome part! As you might have figured, a DES profile will only ship changes to the Azure SQL Database, so we’re not getting any historical data points. That’s where temporal tables come into play. Temporal tables automatically create a history record for each change in a table, with a start and end time that designate when each version of the record was current. 
Start by connecting to your database; using the Query Editor in the Azure portal is fine for this task. Run the following statement for each of the tables you need history for (just change agur_accounts and agur_accountsHistory to reflect the table name). 
ALTER TABLE dbo.agur_accounts
  ADD SysStartTime datetime2 GENERATED ALWAYS AS ROW START
        CONSTRAINT DF_agur_accounts_SysStartTime DEFAULT SYSUTCDATETIME() NOT NULL,
      SysEndTime datetime2 GENERATED ALWAYS AS ROW END
        CONSTRAINT DF_agur_accounts_SysEndTime DEFAULT CONVERT(datetime2, '9999-12-31 23:59:59') NOT NULL,
      PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)
GO

ALTER TABLE dbo.agur_accounts
  SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.agur_accountsHistory))
GO

This adds a start time and an end time column to the table in question, which are automatically populated. Next, the table is altered to turn on system versioning, and we specify the history table to be the same name with History appended to it. Whenever a record is modified or deleted, the old version is pushed into the history table with its end time set, and the main table keeps the updated record with a new start time. This means we also keep track of deleted records, which is why we didn’t need the delete log in the DES profile. 
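As a minimal sketch of that last point (assuming the DES-generated table has its usual Id column holding the Dynamics record GUID), the deleted records are exactly the ones that only exist in the history table:

  -- Records present in the history table but missing from the main table
  -- have been deleted in Dynamics
  SELECT DISTINCT h.Id
  FROM dbo.agur_accountsHistory AS h
  WHERE NOT EXISTS (SELECT 1 FROM dbo.agur_accounts AS a WHERE a.Id = h.Id)
  GO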
Now, to get anything exciting out of this, start by creating, modifying and deleting some records in Dynamics, then head back to the query editor and run some temporal table queries (https://docs.microsoft.com/en-us/sql/relational-databases/tables/querying-data-in-a-system-versioned-temporal-table?view=sql-server-2017). 
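A couple of examples to get you started (the timestamp is just an illustration, and the Id column assumes the default DES schema):

  -- The table as it looked at a specific point in time
  SELECT *
  FROM dbo.agur_accounts
  FOR SYSTEM_TIME AS OF '2019-02-14T12:00:00'
  GO

  -- Every version of every record, current and historical
  SELECT Id, SysStartTime, SysEndTime
  FROM dbo.agur_accounts
  FOR SYSTEM_TIME ALL
  ORDER BY Id, SysStartTime
  GO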
I’ve made 6 modifications in addition to 2 creates, so this is the result of a simple query: 
 

Getting the data back in 
So far so good. Now let’s take a look at how we get this data back into the system. We start by going to the Power Platform portal, https://web.powerapps.com, and choosing the environment we want to restore into from the Environment dropdown at the top. Navigate to Data -> Entities on the left-hand side, and click on the Get Data button in the ribbon. 
This opens the import data wizard. Choose Azure SQL database as the data source, then type in the connection settings and proceed. On the next page you choose the tables to import from, and in this list you will notice that the temporal table(s) you’ve added are also listed. Select the temporal tables to import from and click next. 
On this screen you meet the Power Query editor, which is an amazing way to query and manipulate data. Scroll all the way to the right and you’ll find the last two columns, SysStartTime and SysEndTime. Here you can filter on the dates so that you get the data as it looked at the time you specify. I’m filtering on end time after a given date, which fits my limited data set, but a more realistic filter would be “start time before X” AND “end time after X”. There will not be any duplicates this way, as there is only one unique version of each record at any given time. Then click on the Manage Columns button and click Select Columns. You will get a list of all the columns, which can be sorted by name instead of the default sort order. Select the columns you want to include, then click next. 
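For reference, that filter has the same semantics as a temporal AS OF query, which is why duplicates can’t occur; here is a T-SQL sketch of the same predicate (the timestamp is illustrative):

  -- For any given @X, exactly one version of each record satisfies this
  -- predicate, so the filtered result contains no duplicate records
  DECLARE @X datetime2 = '2019-02-14T12:00:00'
  SELECT *
  FROM dbo.agur_accounts FOR SYSTEM_TIME ALL
  WHERE SysStartTime <= @X AND SysEndTime > @X
  GO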
Now it’s time to map entities. I’ve only selected account, so it’s easy for me. Please note that you can create new entities directly from this view. I don’t recommend that from an ALM perspective, but for demo purposes it’s fine. If you haven’t included any key columns you’ll get a warning, and in that case you’ll have to choose to delete records that aren’t matched. Select the manual refresh option to prevent the system from loading more entities (unless you want that, but that’s outside the scope of this post). Finally, watch the load process as CDS loads the records. This might/will take a while. 
Once it’s finished you can go back into D365CE and see that your data is right there, ready and available! 
 

Wrapping up 
Using Data Export Service, temporal tables and the import data functionality in CDS is a great way to enable point-in-time recovery scenarios. There are some caveats to be aware of, though: 
  • No out-of-the-box way to retain GUIDs 
  • Power Query is awesome, but limited to graphical interfaces. Not a sysadmin favorite.