Tuesday, March 5, 2019

You have been judged

Arctic Cloud Developer Challenge was amazing
As you may know, I was a judge during this year's Arctic Cloud Developer Challenge (#acdc2019).
It was a fantastic event with a lot of great content created by the participants. This year was The Simpsons-themed, and all the teams did a great job creating solutions with related branding. We also had special categories for security vulnerability, ethical hacker and black hat hacker, which could give you positive or negative points. These categories were not used much, though the Swedes managed to set everyone back 50 points at one point.
We also had one lightning challenge per judge, and while I outsourced mine to the fantastic Elaiza Benitez the other judges chose some really cool scenarios.
For this event we got some really cool MX Chips from Ben Vollmer, my favourite person at Microsoft, and we handed out one to each team so they could use it for the IoT category. All the teams were actually doing IoT stuff, which I think is really awesome. IoT is one of the things I believe will be part of everything in the future; in fact, I think the abbreviation will disappear in time and we will just take it as a given that everything around us is connected, so I was very happy to see that all the teams made an effort.
We had three judging sessions over the three days, where we would give out points to each team in each category, and the winner of a category would get a crown to show off that they were "the ones to watch" for that category. On the second and third days we would collect the crowns and hand them out to the new winners. By the end of the second day the Three Eyed Fishes (Skill) were clearly in front by quite a few points, and I think a lot of the teams got nervous.
On the final day the teams went into full focus mode, and the mood was very intense for several hours. Unfortunately for the Three Eyed Fishes, they lost two team members in the last 24 hours due to family situations, so they had a huge handicap in terms of replanning and redistributing responsibilities among the remaining members, and the competition certainly did not slack off with victory within reach.
After the third and final judging the team scores looked much more even, and the popular vote points could swing it in any direction.


The teams got 10 minutes to present their solution, and at the end all the attendees, judges, guests and organizers got to vote for their favourite. Each vote counted for 15 points, and the teams could not vote for themselves (which would have cost them 15 points). It was really close, but in the end the Swedish Snowballs (DQC) brought home the victory by a margin of 55 points over the Three Eyed Fishes.

This was an amazing performance by the Swedish Snowballs, as they went from last place to first place in 24 hours. The big thing about the final day was that each crown was worth 100 points, so winning any category on the final day would not only give 100 points for the category, but also an additional bonus of 100 for going home with the crown. The Swedish Snowballs managed to capture three crowns, and here's a quick overview of which ones and why:
  • The Genesis Tub (IoT category): The first two days they struggled a lot with their devices, and they brought quite a few of them to the show. However, on the third day they had worked through all their issues, and they had three working Raspberry Pi units placed around the venue with cameras, looking for Nelson to try and avoid him. The main thing that swayed the judges was the fact that they had experimented with facial recognition for a long time, and finally decided to move the processing off the Pis and create an online service which did all the processing for them. This got the processing time down from well over 10 seconds to around 1 second, making it a much more fluid solution. This is a great demonstration of how IoT edge devices should not be assigned the responsibility of heavy processing, but only do the work they were meant for (image capturing and motion detection in this case).
  • Obey the Hypno Toad (UI+UX): The entire interface was built in React, which meant it worked great on all devices once they first got it working. However, they went the extra mile by taking Simpsons sprites and animating them, and they created custom icons for picking a character. Additionally, they added an alert system with audible and visual alerts, so you would get a notification if Nelson was spotted on your path. They also accounted for iOS having removed such alerts in its latest updates, so while platform alerts worked on Android and Windows, iOS devices would get a visual notification at the top of the screen, which managed to be informative without getting in the way of the map navigation. In other words, the UI was not only visually nice to look at, it was also designed in a way which could be integrated into the display unit of a modern car without obstructing the navigation.
  • Automated Teller Machineyolatrolamaton (business value): The reason they won this was that they had several clear, understandable business scenarios for their solution. It was something that could be used for safari tours and parks to make sure that people could go where the animals were. It could also be used to warn people about dangerous animals in an area, to identify poachers, to create evacuation routes in a natural disaster, etc. Additionally, it was working and deployed to a public web site, so the product was ready for sale at the time we went around judging.

As judges we were very impressed with all of the solutions presented to us, and I urge you as a reader to take a look at the Arctic Cloud Developer Challenge blogs to see what all the teams built: https://acdcblog.azurewebsites.net/
Finally, a big shout out to the sponsors for making this event possible; it is probably the most awesome hackathon around.
Now I just have to cross my fingers and hope to be invited to this awesome event again.

Thursday, February 14, 2019

Disaster recovery for Dynamics 365 Customer Engagement

Disaster recovery with Data Export Service and Dynamics 365 Customer Engagement 

Disaster recovery is something which is often overlooked or underappreciated in online deployments of Dynamics 365 Customer Engagement. In this blog post we are going to take a look at how we can use the Data Export Service (DES) solution along with the Get Data functionality found in Common Data Service to build a point-in-time recovery option.

Why disaster recovery? 
Online services come with a lot of awesome functionality which helps mitigate data loss and prolonged periods of downtime. However, no system is ever perfect, and you should always plan for worst-case scenarios and how to survive them. There are many great articles on how to plan and implement disaster recovery, so in this article I'm going to focus on how to implement a disaster recovery plan using DES, temporal tables and CDS.

The mission 
The mission is to implement a disaster recovery plan which enables us to: 
  • Choose a point-in-time restore option 
  • Pick and choose restore options per entity 
  • Have granular control over backed up entities and relationships 

Set up an Azure SQL Database 
Go to https://portal.azure.com and add a new Azure SQL Database. If you don't have an existing server, go ahead and create a new one at the same time. For the pricing tier, the DES documentation recommends an S3 tier for most deployments, but your mileage may vary. Once the database has been created, go to the overview section to find the connection strings. Copy the ADO.NET string and fill in the username and password for the server; you will need that in the next step.
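For reference, the ADO.NET connection string follows roughly this shape (a sketch with placeholder server, database and credential values; the portal shows the exact string for your database):

  Server=tcp:<your-server>.database.windows.net,1433;Initial Catalog=<your-database>;User ID=<your-username>;Password=<your-password>;Encrypt=True;TrustServerCertificate=False;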

Setting up Data Export Service (DES) 
DES is a native solution for D365CE provided by Microsoft. Simply open the Dynamics 365 admin center (https://port.crm<area number>.dynamics.com/g/instances/instancepicker.aspx), click on the organization you want to add DES to, then click the edit solutions button. NB! If the solutions button isn't showing, just click the organization again. Locate the Data Export Service and click install. Make sure you read and understand the TOS, then click install again to install the solution.
 
You can come back and refresh this screen to see how the installation is progressing, but the duration varies a lot, so I recommend checking back in an hour or so, even though it usually doesn't take anywhere near that long. After the solution is installed, open up your Dynamics 365 instance and navigate to Settings => Data Export. Here you will see a web resource which lists all DES profiles; it should be empty if you haven't used it before. Click on the new button to initiate the profile wizard.
Click on the blue information button to get a popup which includes a PowerShell script that will set up a key vault for you. Open the Start menu and run PowerShell ISE, then copy the script contents into the editor. Make sure you edit the placeholders to contain the correct values, then hit F5 to run the script. The script stores the connection string from the previous section as a secret in the key vault, and sets up a service principal which allows Dynamics to access the key vault and retrieve secrets. NB! Permissions in the key vault apply to all of the secrets; there is no granularity within a category. Keep this in mind in case you want to reuse the vault.
After the script has run it will output the key vault URL. Copy this and use it as the Key Vault URL in the DES profile setup. Feel free to enable the delete log option, but we don't need it for this specific guide.
 
Next, select your entities and relationships. Which entities to include obviously depends on what you want to back up, and relationships will give you lookup tables which make it really easy to analyze the data (you probably want this). Check out the summary before you complete the wizard, and voila! You've got your first DES profile up and running.

Configuring temporal tables 
Now for the awesome part! As you might have figured, having a DES profile will only ship changes to the Azure SQL Database, so we're not getting any historical data points. That's where temporal tables come into play. Temporal tables automatically create a history record for each change in a table, with a start and end time designating how the record looked at any given time.
Start by connecting to your database; using the Query Editor in the Azure portal is fine for this task. Run the following statements for each of the tables you need history for (just change agur_accounts and agur_accountsHistory to reflect your table names).
  -- Add the period columns and declare the SYSTEM_TIME period
  ALTER TABLE dbo.agur_accounts
    ADD SysStartTime datetime2 GENERATED ALWAYS AS ROW START
          CONSTRAINT DF_agur_accounts_SysStartTime DEFAULT SYSUTCDATETIME() NOT NULL,
        SysEndTime datetime2 GENERATED ALWAYS AS ROW END
          CONSTRAINT DF_agur_accounts_SysEndTime DEFAULT CONVERT(datetime2, '9999-12-31 23:59:59') NOT NULL,
        PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)
  GO

  -- Turn on system-versioning and point it at the history table
  ALTER TABLE dbo.agur_accounts
    SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.agur_accountsHistory))
  GO

This adds start time and end time columns to the table in question, which are populated automatically. Next, the table is altered to turn on history, and we specified the history table to have the same name but with History appended to it. What this does is that whenever a record is modified or deleted, the old version is pushed into the history table with the end time set, and the main table keeps the updated record with a new start time. This means we also keep track of deleted records, which is the reason we didn't need a delete log in the DES profile.
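If you want to verify that versioning is actually enabled, a quick optional check against the catalog views (a minimal sketch, assuming the table names used above) could look like this:

  -- Optional sanity check: both tables should show up, with the main table marked as system-versioned
  SELECT name, temporal_type_desc
  FROM sys.tables
  WHERE name IN ('agur_accounts', 'agur_accountsHistory');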
Now, to get anything exciting out of this, start by creating, modifying and deleting some records in Dynamics, then head back to the Query Editor and run some temporal table queries (https://docs.microsoft.com/en-us/sql/relational-databases/tables/querying-data-in-a-system-versioned-temporal-table?view=sql-server-2017).
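As a minimal sketch (using the agur_accounts table from above), a query that lists every version of every record, current and historical, could look something like this:

  -- All versions of all records, current and historical
  SELECT *
  FROM dbo.agur_accounts
  FOR SYSTEM_TIME ALL
  ORDER BY SysStartTime;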
I've made 6 modifications in addition to 2 creates, so this is the result of a simple query:
 

Getting the data back in 
So far so good; now let's take a look at how we get this data back into the system. We start by going into the Power Platform portal, https://web.powerapps.com, and choosing the environment you want to restore into from the environment drop-down at the top. Navigate to Data -> Entities on the left-hand side, and click on the Get Data button in the ribbon.
This opens the import data wizard. Choose the Azure SQL database as the data source, then type in the connection settings and proceed. On the next page you can select the tables you want to import from, and in this list you will notice that the temporal table(s) you've added are also listed. Select the temporal tables to import from and click next.
On this screen you meet a Power Query editor, which is an amazing way to query and manipulate data. Scroll all the way to the right and you'll find the last two columns: SysStartTime and SysEndTime. Here you can filter the dates so that you get the data from the point in time you specify. I'm filtering for an end date after a given date, which fits my limited data set, but a more realistic filter would be "start time before X" AND "end time after X" (a T-SQL sketch of that filter follows below). There will not be any duplicates this way, as there is only one unique version of a record at any given time. Then click on the Manage Columns button, and click Select Columns. You will get a list of all the columns, which can be sorted by name instead of the default sort order. Select the columns you want to include, then click next.
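For comparison, the same point-in-time filter expressed directly in T-SQL would look something like this (the restore point timestamp is just a placeholder):

  -- Rows that were current at the chosen restore point
  DECLARE @restorePoint datetime2 = '2019-03-01T12:00:00';

  SELECT *
  FROM dbo.agur_accounts
  FOR SYSTEM_TIME ALL
  WHERE SysStartTime <= @restorePoint
    AND SysEndTime   >  @restorePoint;  -- equivalent to FOR SYSTEM_TIME AS OF @restorePoint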
Now it's time to map entities. I've only selected account, so it's easy for me. Please note that you can create new entities directly from this view; I don't recommend that from an ALM perspective, but for demo purposes it's fine. If you haven't included any key columns you'll get a warning, and in that case you'll have to choose the delete-records option. Next, select manual refresh to prevent the system from automatically loading data again on a schedule (unless you want that, but that's outside the scope of this post). Finally, watch the load process as CDS loads the records. This might/will take a while.
Once it’s finished you can go back into D365CE and see that your data is right there, ready and available! 
 

Wrapping up 
Using the Data Export Service, temporal tables and the import data functionality in CDS is a great way to enable point-in-time recovery scenarios. There are some caveats to be aware of, though:
  • No out-of-the-box way to retain GUIDs
  • Power Query is awesome, but limited to a graphical interface. Not a sysadmin favourite.