Wednesday, December 6, 2017

Set Properties not working when using editable grids

I found a little issue using editable grids. The following error occurs in all versions since editable grids were introduced.

When I have an editable grid on the primary entity form, I cannot set values for that entity inside a workflow. To reproduce the error, do the following:

  1. Create a new custom entity (or use an existing entity if you dare)
  2. Add a 1:N relationship from entity1 to any other entity
  3. Add a sub-grid to the form of entity1
  4. Go to the controls tab of the sub-grid, and choose add control. Select the Editable Grid control, and activate it for web
  5. Save and publish the form
  6. Create a new workflow for the entity
  7. Add an update or create step for the record, and open the “set properties” window
  8. When you select any field, the operator drop-down and field drop-down will be empty, and you cannot select any values

Changing the control to the standard read-only grid prevents this error. Also, the error only occurs for custom entities with editable grids, not for default entities with editable sub-grids.

Diving into the debug console of your browser, you will see the following error:

SCRIPT5009: '$P_CRM' is undefined
File: crminternalutility.js, Line: 1, Column: 1786

Browsing through the code it looks like $P_CRM should be an instance of jQuery, but it is never assigned.
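A quick, purely diagnostic sketch of that theory (not a supported fix; a stub object stands in for the page's real window and jQuery globals so it's self-contained):

```javascript
// Stub the globals; in the browser you would inspect the real window
// object from the debug console instead of this stand-in.
const fakeWindow = { jQuery: function jQuery() {} };

// crminternalutility.js apparently expects $P_CRM to be a jQuery instance,
// but nothing ever assigns it. Patching it in the console confirms the theory:
if (typeof fakeWindow.$P_CRM === "undefined" && fakeWindow.jQuery) {
  fakeWindow.$P_CRM = fakeWindow.jQuery;
}

console.log(typeof fakeWindow.$P_CRM); // "function"
```

Patching internals like this is unsupported and only useful to verify the diagnosis; the real fix has to come from Microsoft.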

Tuesday, October 24, 2017

Dynamics 365 Virtual Entities quick view bug

In my quest to discover all the possibilities of Dynamics 365 virtual entities I've discovered a bug that can really mess up your day.

When you've added an N:1 relationship from a normal entity to a virtual entity, and you add that field to the entity form, things can break badly. Here's what I did:

  1. Create new virtual entity
  2. Add 1:N relationship from virtual entity to normal entity (account in my case)
  3. Add the relationship field on the account form
  4. Add a quick view control for the virtual entity on the account form
  5. Add a reference to external virtual entity data on existing account
  6. Delete data from external data source
  7. Open account form
  8. An error message is shown, and you cannot open the account form
The error message you get is the following

If you open the log file it states the following error type:
[Microsoft.Xrm.DataProvider.Odata.V4.Plugins: Microsoft.Xrm.DataProvider.Odata.V4.Plugins.ODataRetrievePlugin]

Yes, I'm experiencing the same thing, how can you help me?

It's an easy fix, albeit a stupid one. To fix this you need to remove the quick view form (not the lookup field) from the account form, and publish the changes. Alternatively, you can open the account in another form where the quick view for the virtual entity isn't present. What you'll see when you open this form is that instead of deleting the value, Dynamics replaces it with a "(No name)" reference.

The reason for this is that the clever guys at Microsoft decided that they shouldn't automatically remove/delete the reference, because that would delete your references when the external system is unavailable. However, they don't have a proper check in place for quick view controls on the entity form.
I tried to put the quick view in a hidden control, a hidden tab, and even a collapsed section, but nothing prevents this error from triggering. So there is no work-around as of now.

Just gotta wait for the next iteration I guess :)

Monday, October 23, 2017

Dynamics 365 Virtual Entities unsupported field types

Microsoft recently introduced virtual entities into Dynamics 365. This is an easy way (or is it?) to integrate with external systems, removing the need for data duplication while keeping a proper data model definition.

Check out Jesper Osgaard's posts on the subject at his blog

The documentation is not exactly updated and optimal for virtual entities yet, so I'm going to add this little tidbit into the world of MSDYN365 Customer Engagement. Some of the unsupported options are documented, but I'm adding them as well because they are available in the UI.
And finally, before we jump into the salad, business required doesn't matter at all, because you cannot edit data in virtual entities (yet). That said, you might be able to in the future, so plan ahead will you?

Adding fields to virtual entities

After you've set up a new virtual entity, when you go to create a new field you get the following options as data types

Keys are not supported

Trying to add alternate keys to a virtual entity will give you the normal editing screen, but as soon as you click OK to create the new key you will be presented with the following error

Fields not supported

First off, the two types that are selectable but will give you an error: calculated and rollup fields. Trying to add either will give you the following errors:
Attribute field type: Calculated is not valid for virtual entity.
Attribute field type: Rollup is not valid for virtual entity.

Multi-Select option set


At first when you start adding the field it looks like everything should work, but that's only because it's the same editing form as for option sets. Trying to save the field will give you an error message, and if you download the log file it says
Attribute data type: multiselectpicklist is not valid for this entity.

Image

Trying to add an image field will generate an error which is much more cryptic. The message is as follows:
Cannot find the object "new_agreementBase" because it does not exist or you do not have permissions.

Seems like there's a missing null check here, so they're trying to access the database and modify the content.

Currency

Trying to add currencies will produce the following error in the log file:
Attribute data type: money is not valid for this entity.

This is probably because currency (or money) is represented by two tables: one which contains the base conversion, and one which contains the value. It seems like something that could be supported in the future, but right now it isn't.

Customer

Trying to create a customer field will give a somewhat more cryptic error message
For Relationships referencing Virtual Entites, Cascade link type must be No Cascade

The issue here is that customer is a polymorphic field (kind of like an intersect of other fields) referencing both the account and contact entities, and when you add a customer field you get an implicit cascade delete referential relationship from one field to the other. This relationship cannot be edited, and since virtual entities are read-only they only support No Cascade.
It stands to reason that if/when edit options become available this field will be supported as well.

Supported number ranges

All number types are supported like before (except for those applicable for money, obviously), but for those of you who hoped to be able to build bigger/smaller numbers in an external system I have some bad news: it still has the same size limitation as default fields. That goes for both number size and decimal precision.


Wrap up

So there you have it, all the fields that are NOT supported in MSDYN365 v9 as of right now (October 10th 2017). The virtual entity is still quite new, and Microsoft isn't holding your hand every step of the way just yet. Please drop me a line if you experience something different than what I've posted here, and I'll try to keep up to date as things change in the future.

Also, thanks to @rappen for proof reading and providing better explanations

Wednesday, October 11, 2017

SharePoint retention policy and Office Groups, part 2

In the previous post we looked at how to create a label and apply a policy.
In this part we're going to look at how this will work in a real life scenario.

Labeling content

First of all, let's head into the Office Group we created, and look at the site collection overview. I'm saying site collection because that's basically what it is, but unlike traditional site collections it's not available through SharePoint admin, and it has some extra settings to it. But don't take my word for it, check out Mikael Svenson's blog for a lot of amazing content on everything Office 365.
Opening up the site we see that there's nothing special going on here. It's an empty landing page with the only activity being the creation of the group and the document we uploaded (at least I did).

Going into the documents section, we find the file uploaded in the previous post (or you can just create a new file here). Click on the vertical ellipsis, expand the More category, and click on Compliance details.
This will show you a screen that looks something like this:

Click on the "None" link for the Label status. This will open a page where you can specify the label for this document. Go ahead and select the label we created in part 1, and then hit save.

Deleting the document

Now that we have a document with a label, let's try to see what happens if we try to delete it.
Click the ellipsis and select delete, then click Delete in the confirmation box.
If you set up the archive policy like me, you should be presented with the following message

Great, so that means the stuff we're saving cannot be deleted. We didn't select the checkbox for treating files as records (read-only), so we can still edit the document like we want.

Deleting the group

That's right, we're going to delete the Group already. We now have a Group with one document in it, and the document has a label which says to have a 2 year retention after the last modified on date.
Click on the cogwheel in the top right, then select Site information. In the side bar, click on the Delete site link, and then check the box which says "Yes, delete this group and all its associated resources.". Now click that delete button and cross your fingers.

Where did everything go?

If your tenant is like mine, the group was deleted and you were redirected to the SharePoint root site. Let's take a look around to see if we can find traces of the group.
Going into Outlook we can see that the Group is still there, which is just because Exchange handles these deletions on a schedule that nobody® quite understands. I'm going to go ahead and delete it from Outlook as well. Click on the down-arrow in the top right corner, then select Edit group.
Click on the Delete group link, and in the dialog window check the box stating "I understand that all group content will be deleted"


OK, now everything is gone? No, there's still the matter of the recycle bin. Unfortunately, deleted groups are not available through the UI, so you have to bring out some of your awesome PowerShell skills (or just copy the commands below). Now, I'm assuming you have a modern version of Windows with the possibility to install modules from the shell. If you don't, you'll have to do some internet research on how to install them for your environment.

Open PowerShell as an administrator and run the following commands to find a deleted group:

Install-Module AzureADPreview
You will get a warning about the repository; I trust the repo so I'm choosing yes. If you don't trust this repo then I'm afraid this is the end of the line for you; if you do, select yes and continue.

Import the newly installed module and connect to Azure AD, and then retrieve the deleted groups using the following commands

Import-Module AzureADPreview
Connect-AzureAD
Get-AzureADMSDeletedGroup -SearchString CrmVikingDoesSharePoint

For me that lists out the Group I deleted. Now, to delete the group, simply run the following command (the Id has to be handed over inside a ForEach-Object block):

Get-AzureADMSDeletedGroup -SearchString CrmVikingDoesSharePoint | ForEach-Object { Remove-AzureADMSDeletedDirectoryObject -Id $_.Id }

Group deleted, not a single warning presented.
This means that the retention policy provided by labels does not prevent Groups and their content from being deleted.
To recover a group, the following command can be run instead
Get-AzureADMSDeletedGroup -SearchString CrmVikingDoesSharePoint | ForEach-Object { Restore-AzureADMSDeletedDirectoryObject -Id $_.Id }

Wrap up: OK, this seems weird.

So it seems that the retention policy set by using labels will prevent us from deleting files by mistake, but it certainly does not protect the site from deletion.
I'm guessing that there's some sort of logical mishap on the server side, because why would you force an administrator to verify that no content inside a group is set to be archived?

This tells me that the retention policies aren't quite production ready, and I have to find some other clever way to use Office Groups that helps reduce Outlook clutter while not deleting all content.

Until next time!

SharePoint retention policy and Office Groups, part 1

While I was working on a specification for a MSDYN365 delivery I started to look into how I could use retention policies and automatic deletion of content. I really like to use Office Groups, so I wanted to check out what we could do with the default retention policies, and how they would work in action.

This first part only explains how to create labels, policies and groups (from MSDYN365).
Take a look at part 2 to see how it works when everything is in place.

Creating a new label

So the first thing to do is to log in to Office 365 as an administrator. Go to the waffle menu and open up Security & Compliance.


From this screen, expand the Classifications category and open up Labels.
Click on the Create a Label button to get the label creation wizard. Give it a suitable name and go to the label settings.
On this screen, I chose to turn on retention, and to set the retention period to be 2 years, starting from the last time the content was modified. I also wanted to trigger a disposition review, instead of just deleting it automatically or keeping it as it was.
I did not want to classify it as a "Record". Doing that would prevent me from editing the content, and this was not meant as an online archive solution so I kept it unchecked.

Then it's just a matter of reviewing the settings and creating the label.
As soon as you're done with this, you get a side bar with basic information and some actions you can perform.

Publishing the label

From the side bar in the previous section click the Publish label button. Alternatively, from the label overview click the publish labels button. Choose the one(s) you want to publish, and hit the next button.
For me I wanted to be able to retain content in Office 365 Groups only, as I'm going to use that in my MSDYN365 deliveries.
So I deselected Exchange, SharePoint sites and OneDrive accounts, leaving only Office 365 groups.
I could also include or exclude specific groups, but for simplicity I wanted it published to all of them.

Give the policy a fitting name and description, then review the setting.
Please take note of the warning that the labels might take up to 1 day before they're visible to users. Also, notice that it will not be visible in Outlook (desktop client and OWA) for mailboxes that are smaller than 10MB.
Publish the label to make it active.

Now you get a side bar with information about the label, as well as a status (which will most probably say pending).

Creating an Office 365 Group

I assume this step is unnecessary for most people, so I'm just going to show quickly how to create the group from Microsoft Dynamics 365 (Customer Engagement). If creating a group is new to you, I advise you to take a look at the Office support documentation.

In MSDYN365 I've opened up an account I have, and navigated to the Office Groups sub-content

From this screen I'm going to create a new group. By default, when creating a group from MSDYN365 it will be a private group. It can also only use Exchange groups, so Yammer groups are a no-go.

Once this is done I will get a nice overview of the group content, and I can start using it.
I've uploaded a document which I will use to apply the newly created label for part 2.

The wrap up

Stay tuned for part 2, which will take a look at how labels work in real life. We will delete some content and see how that works.

Tuesday, October 10, 2017

License required to install and upgrade solutions

I noticed something strange when I logged into my MSDYN365 administration center today with a tenant administrator.
I reset one of my sandbox instances and deployed a new July Update (v9) version, and after it completed the deployment I went in to add the O365 Group solution.

When I opened up the solutions for the instance, I got the following error message:
You do not have a Dynamics 365 license for this Dynamics 365 instance. Please get a license at portal.office.com.

I thought that this was really weird, because as an administrator you're allowed to create, export and import solutions.
I went into the instance and created a new solution, added the account entity to it. Then, when I tried to export the solution I got the following error message:
You do not have permission to access these records. Contact your Microsoft Dynamics 365 administrator.

So I went into one of my v8.2 instances and did the same operations. No issues, able to create, export and import solutions without any errors.

Went back in to the admin center and tried to manage solutions for my v8.2 instances. Got the same error message and no option to install new or upgrade existing solutions.
Surely this must be a logical slip from Microsoft, so I'm going to try and get in touch with the support team about it.

To wrap up:

  • Administering solutions in the MSDYN365 Administration Center requires licensing as well as tenant or service administrator, for all instance versions.
  • Exporting and importing solutions inside an MSDYN365 instance works for v8.2
  • Exporting and importing solutions inside an MSDYN365 instance does not work for v9.0 (but creating and publishing does).
  • Creating new items from the Administration Center is not allowed in any version
  • Publishing all customizations is allowed in all versions

Thursday, September 21, 2017

Dynamics 365 Admin API quirks

Today I finally solved an issue I've been struggling with for 51 days now.
On August 10th Microsoft launched a new API for Microsoft Dynamics 365 Customer Engagement which allows you to perform administrative tasks (finally) for your online environment.
So, bleeding-edge kind of guy that I am, I set forth and tested it out, which led to my blog series on the topic (part1, part2, and part3).
The most exciting feature for me was the ability to back up instances to Azure storage. Unfortunately, all my attempts to use this function failed with an error message of 500: Internal server error.
I opened up a ticket with Microsoft, and they confirmed my findings and proceeded to escalate it internally.

Now I'm not someone to let something like this just lie, so I've spent quite some time digging into what's wrong with the API. That's when I came across a blog post by Rami Mounla which said that he was using it without issues.
I found that strange, because I was using the example code provided by Microsoft (with some tweaks), so I decided to test it out with the example code straight out of the box

No dice, as you can see I receive a big, fat error 500 in return.
So I tried the next logical step: I excluded some of the values to see what happened if I didn't include the required fields.

Now I got an error 400, stating that I was missing the InstanceId.
Great stuff, now we just need to go by way of deduction. So I started from scratch, including only the required fields. At least I would be able to do a normal MSDYN365 backup then?
No, even with only the required fields it still gave me an error 500.
So I updated my ticket with Microsoft, went on a social media rampage, and managed to escalate it to the product group. The product group confirmed the error, but I think they might have other priorities than just my ticket. Luckily, the most awesome Swedish person in the world, Jonas Rapp, used his wide range of connections to get me in touch with the no less amazing George Doubinsky.
It seems George is getting it to work as well, and he agreed to help me with some debugging.

After hours of debugging and trying all sorts of stuff (remove all headers, try xml, try building the code from scratch, try creating a new Azure AD App, the list goes on), we finally tried the following.
I created a new trial environment, and we both used our own code.
Worked for George, not for me. Obviously I'm doing something wrong?
So George packed together a postman file for me with the authorization header included. Surely that should work on my computer?
No, even with the exact same request I got an error 500, and George got an OK 202.

That was really, really strange to me, so I tried to figure out what was wrong with my machine. I did IE cleanup, reboot, flushed all kerberos tickets. Still nothing. Tried to replay the request in fiddler: still no luck.
Finally, I created a new user on my computer, and installed postman for chrome instead of the desktop client.
BOOM! OK 202. Checked my azure subscription, and right enough there was my new, shining backup.

So I took a screenshot of the entire request and response, and brought it back to my other machine.
Turns out, the Postman extension for Chrome automatically adds the Accept-Language header to the request, while the desktop client does not.
That means that the example provided by Microsoft is the one to blame (thankfully, because I've spent A LOT of time on this).
To make it work, simply add this one little beautiful line into the OAuthMessageHandler class in the authenticationhelper:
request.Headers.AcceptLanguage.Add(new StringWithQualityHeaderValue("en-US"));
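For what it's worth, the requirement is easy to express outside C# too. A minimal JavaScript sketch of the headers such a request needs (the token and the header names other than Accept-Language are just illustrative):

```javascript
// Build the headers for an admin API request; the Accept-Language entry
// is the one the Microsoft sample code leaves out.
function buildAdminApiHeaders(accessToken) {
  return {
    "Authorization": "Bearer " + accessToken,
    "Accept-Language": "en-US", // without this, backup requests failed with HTTP 500
    "Content-Type": "application/json",
  };
}

const headers = buildAdminApiHeaders("example-token");
console.log(headers["Accept-Language"]); // en-US
```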

So there you have it. For some reason the implementation requirements are not the same for all of the methods in the admin API. On to the next issue!

Thursday, August 24, 2017

Piggybacking on MSDYN365 PluginRegistrationTools ADAL implementation (plagiarizing Mikael Svenson)

My brilliant colleague, Mikael Svenson, wrote a cool blog post on piggybacking on the SharePoint Online Management Shell ADAL application
Inspired by this (relatively) dirty hack I tried to figure out which applications Microsoft has given to us that may support OAuth OOTB.
Turns out, the PluginRegistrationTool does!
I fired up the PluginRegistrationTool from the SDK, hooked fiddler on to it and hit the "create new connection" button in the tool.
Checked the query string in the initial authorize request and Voila!

Splitting up this query string we get the following two values:
client_id=2ad88395-b77d-4561-9441-d0e40824f9bc
redirect_uri=app%3A%2F%2F5d3e90d6-aa8e-48a8-8f2c-58b45cc67315%2F

Cleaning up the redirect_uri gives us this nice app id redirect uri we can use:
app://5d3e90d6-aa8e-48a8-8f2c-58b45cc67315/
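The clean-up is just standard percent-decoding, which you can do straight from a browser console:

```javascript
// Decode the percent-encoded redirect_uri captured from the authorize request.
const raw = "app%3A%2F%2F5d3e90d6-aa8e-48a8-8f2c-58b45cc67315%2F";
console.log(decodeURIComponent(raw)); // app://5d3e90d6-aa8e-48a8-8f2c-58b45cc67315/
```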


This allows us to piggyback on Microsoft's own app registration, which doesn't require approval for users when you distribute an application. Example taken from my blog series on using the new admin api in PowerShell

using Microsoft.IdentityModel.Clients.ActiveDirectory;
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;

namespace MSDYN365AdminApiAndMore.Helpers
{
    public class AuthenticationHelper
    {
        private static string _clientId = "2ad88395-b77d-4561-9441-d0e40824f9bc";
        private static string _redirectUrl = "app://5d3e90d6-aa8e-48a8-8f2c-58b45cc67315/";

        private Uri _endpoint = null;
        private string _resource = null;
        private string _authority = null;
        private AuthenticationContext _authContext = null;
        private AuthenticationResult _authResult = null;

        public AuthenticationHelper(Uri endpoint)
        {
            _endpoint = endpoint;
        }

        public string Authority
        {
            get
            {
                if (_authority == null)
                {
                    DiscoverAuthority(_endpoint);
                }
                return _authority;
            }
        }

        public AuthenticationContext AuthContext
        {
            get
            {
                if (_authContext == null)
                {
                    _authContext = new AuthenticationContext(Authority, false);
                }
                return _authContext;
            }
        }

        public AuthenticationResult AuthResult
        {
            get
            {
                Authorize();
                return _authResult;
            }
        }

        public HttpMessageHandler Handler
        {
            get
            {
                return new OAuthMessageHandler(this, new HttpClientHandler());
            }
        }

        private void DiscoverAuthority(Uri discoveryUrl)
        {
            try
            {
                Task.Run(async () =>
                {
                    var ap = await AuthenticationParameters.CreateFromResourceUrlAsync(discoveryUrl);
                    _resource = ap.Resource;
                    _authority = ap.Authority;
                }).Wait();
            }
            catch
            {
                // Rethrow without resetting the stack trace (throw e; would).
                throw;
            }
        }

        private void Authorize()
        {
            if (_authResult == null || _authResult.ExpiresOn.AddMinutes(-30) < DateTime.Now)
            {
                Task.Run(async () =>
                {
                    _authResult = await AuthContext.AcquireTokenAsync(_resource, _clientId, new Uri(_redirectUrl),
                    new PlatformParameters(PromptBehavior.Always));
                }).Wait();
            }
        }

        class OAuthMessageHandler : DelegatingHandler
        {
            AuthenticationHelper _auth = null;
            public OAuthMessageHandler(AuthenticationHelper auth, HttpMessageHandler innerHandler) : base(innerHandler)
            {
                _auth = auth;
            }
            protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
            {
                request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", _auth.AuthResult.AccessToken);
                return base.SendAsync(request, cancellationToken);
            }
        }
    }
}

Final thoughts:

Should you use this? No, probably not. They might change it at any time, or they could introduce connection string inputs for the tool which requires you to register your own app.
Will I be using this? Great question!

Thursday, August 17, 2017

Using Dynamics365 Customer Engagement admin API with PowerShell, part3

In part1 and part2 of this blog series we looked at scaffolding and building an authentication helper which we can use in a commandlet class. In this part we're going to build our own HTTP message handler and perform some queries against the adminapi.
The code in its entirety is available in this GitHub repository.

Creating a custom HTTP message handler

As we saw previously we now have a valid bearer token which we can use to query the adminapi. We can now create a HTTP request and send it to the API to get a result back. But instead of starting from scratch we're going to reuse the code from the Microsoft docs and create our own custom message handler which will instantiate and propagate everything we need, as well as injecting the token into the request.
Go into the AuthenticationHelper class, and append the following code to the end of the class (inside the AuthenticationHelper declaration).

class OAuthMessageHandler : DelegatingHandler
{
    AuthenticationHelper _auth = null;
    public OAuthMessageHandler(AuthenticationHelper auth, HttpMessageHandler innerHandler) : base(innerHandler)
    {
        _auth = auth;
    }
    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        request.Version = HttpVersion.Version11;
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", _auth.AuthResult.AccessToken);
        request.Headers.AcceptLanguage.Add(new StringWithQualityHeaderValue("NORWEGIAN-VIKING"));
        return base.SendAsync(request, cancellationToken);
    }
}
This class inherits from DelegatingHandler, which takes care of all the basic plumbing for us. It has an AuthenticationHelper field which we set in the public constructor, as well as an HttpMessageHandler which is sent to the base constructor.
Please notice the AcceptLanguage header that has been set. This is not necessary for the GET requests (at the time of writing), but it is required for the POST requests. As you might notice I haven't specified a valid language, and what happens then is that it defaults to English.
If, however, I were to specify nb-NO, then the response would be in Norwegian, so there's a nice trick for you.
Next we override the SendAsync method to inject our headers. What we've done here is get the access token from the AuthResult property of the AuthenticationHelper. What this means is the following:

  1. When the AuthResult property is retrieved it triggers the Authorize() method which uses the AuthContext property.
  2. When the AuthContext property is retrieved it instantiates a new AuthenticationContext object with the Authority property.
  3. When the Authority property is retrieved, the DiscoverAuthority method is triggered, which retrieves the 401 challenge that gives us the resource and authority based on the service URL set in the public constructor of the AuthenticationHelper class.
This means that everything we need is instantiated and propagated just by setting this one authorization header and accept-language, and it's easy to follow the flow of the code.
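The lazy property chain above can be boiled down to a small sketch. This is a generic JavaScript illustration of the pattern, not the actual C# implementation; the names mirror the helper class, and the authority URL is a made-up stand-in for the 401-challenge discovery:

```javascript
// Each getter creates its dependency on first access, so retrieving the
// auth result at the end pulls the whole chain into existence.
class LazyAuthSketch {
  get authority() {
    if (this._authority === undefined) {
      this._authority = "https://login.example/common"; // stands in for DiscoverAuthority
    }
    return this._authority;
  }
  get authContext() {
    if (this._authContext === undefined) {
      this._authContext = { authority: this.authority }; // new AuthenticationContext(Authority)
    }
    return this._authContext;
  }
  get authResult() {
    // Authorize() equivalent: uses authContext, which in turn uses authority.
    return { accessToken: "token-from-" + this.authContext.authority };
  }
}

const auth = new LazyAuthSketch();
console.log(auth.authResult.accessToken); // one access ran the whole chain
```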

Finally we add a public property which will return a new instance of the handler. The handler will be disposed when we complete the request, so we need to make sure that we're instantiating a new one whenever we get it.

public HttpMessageHandler Handler
{
    get
    {
        return new OAuthMessageHandler(this, new HttpClientHandler());
    }
}

We are finally ready to actually perform some requests against the admin API.

Sending requests to the adminapi

To send a request we must first add some code to our commandlet. Add the following lines to the end of the ProcessRecord method to perform the request, and then print the response to the console.

using (var httpClient = new HttpClient(auth.Handler))
{
    var result = httpClient.GetStringAsync(serverUrl).Result;
    Console.WriteLine(result);
    Console.ReadLine();
}
Because we're doing this in a script we're not bothering with async requests. We want the result at once, and we're not doing anything before the response is returned.
Once entered, hit [F5] to start debugging, and log in with the credentials of an MSDYN365 admin.
If this is the first time you've logged in with that user then you will be presented with the following window which you need to approve
This simply says that it will use your authenticated credentials to perform actions on your behalf, and read the directory information (needed to pass claims to MSDYN365).
It looks more severe than it really is, if you're running code that asks you for credentials then this is not the thing you should be worried about.

Once the request is completed the output to your PowerShell window should look like this:
Congratulations! You're using the new adminapi!

Reusing the connection in additional commandlets

When we extend this project to include more commandlets we should try to reuse our connection. To do this we need to make a few changes to our commandlet.

[Cmdlet(VerbsCommon.Get, "DynamicsInstances")]
public class GetDynamicsInstances : PSCmdlet
{
    [Parameter(Mandatory = true)]
    [ValidateSet("NorthAmerica", "SouthAmerica", "Canada", "EMEA", "APAC", "Oceania", "Japan", "India", "NorthAmerica2", "UnitedKingdom", IgnoreCase = true)]
    public string Location { get; set; } // cmdlet parameters must be properties, not fields

    private AuthenticationHelper _auth = null;

    protected override void ProcessRecord()
    {
        base.ProcessRecord();
        Enum.TryParse(Location, out DataCenterLocations tenantLocation);
        var serverUrl = UrlFactory.GetUrl("admin.services", tenantLocation, "/api/v1/instances");

        if (SessionState.PSVariable.Get("auth") != null)
        {
            _auth = SessionState.PSVariable.Get("auth").Value as AuthenticationHelper;
        }
        else
        {
            _auth = new AuthenticationHelper(serverUrl);
        }

        using (var httpClient = new HttpClient(_auth.Handler))
        {
            var result = httpClient.GetStringAsync(serverUrl).Result;
            Console.WriteLine(result);
            Console.ReadLine();
        }

        SessionState.PSVariable.Set("auth", _auth);
    }
}
As you can see, we've now added a private _auth field to the cmdlet.
In addition, there is now an if clause that checks whether a PSVariable named "auth" already exists in the session. If it does, its value is assigned to the _auth field.
If it does not, we instantiate a new AuthenticationHelper object and assign that instead.

At the end of ProcessRecord we've added a line which sets a PSVariable named "auth" to our AuthenticationHelper object. This stores the instantiated helper in the PowerShell session, so we can reuse it for subsequent commands in the same session.

To demonstrate this, copy the content of this class into a new class named GetDynamicsInstanceTypeInfo.
Paste the code into the new class, and change the following lines:
  • Change the cmdlet decoration to say "DynamicsInstanceTypeInfo"
  • Change the class name to GetDynamicsInstanceTypeInfo
  • Change the trailing part of the serverUrl to "/api/v1/instancetypeinfo"
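Putting those three changes together, the new class should look something like this (a sketch; it is identical to GetDynamicsInstances apart from the cmdlet noun, the class name, and the URL path):

```csharp
[Cmdlet(VerbsCommon.Get, "DynamicsInstanceTypeInfo")]
public class GetDynamicsInstanceTypeInfo : PSCmdlet
{
    [Parameter(Mandatory = true)]
    [ValidateSet("NorthAmerica", "SouthAmerica", "Canada", "EMEA", "APAC", "Oceania", "Japan", "India", "NorthAmerica2", "UnitedKingdom", IgnoreCase = true)]
    public string Location;

    private AuthenticationHelper _auth = null;

    protected override void ProcessRecord()
    {
        base.ProcessRecord();
        Enum.TryParse(Location, out DataCenterLocations tenantLocation);
        // Only the trailing path differs from GetDynamicsInstances
        var serverUrl = UrlFactory.GetUrl("admin.services", tenantLocation, "/api/v1/instancetypeinfo");

        // Reuse the session-stored helper if it exists
        if (SessionState.PSVariable.Get("auth") != null)
        {
            _auth = SessionState.PSVariable.Get("auth").Value as AuthenticationHelper;
        }
        else
        {
            _auth = new AuthenticationHelper(serverUrl);
        }

        using (var httpClient = new HttpClient(_auth.Handler))
        {
            Console.WriteLine(httpClient.GetStringAsync(serverUrl).Result);
            Console.ReadLine();
        }

        SessionState.PSVariable.Set("auth", _auth);
    }
}
```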
Next, go into the properties of the project and change the command line arguments to the following:

-NoLogo -Command "Import-Module '.\MSDYN365AdminApiAndMore.dll'; Get-DynamicsInstances -Location EMEA; Get-DynamicsInstanceTypeInfo -Location EMEA"
Now, start a new debug session and log in as you did previously. You will get the same list of instances as before, but when you hit return it will perform a new request to get the instance types. This request completes without asking for your credentials again, which means we're successfully storing and retrieving the PSVariable in our session.

Extending the authentication class to support MSDYN365 data API

Our authentication helper works great, but we can make it even better by teaching it to handle normal MSDYN365 authentication as well. The problem we face here is that to get the WWW-Authenticate headers from the MSDYN365 Customer Engagement API we need to use a different URL path than for the admin services.
Where the admin services use "/api/aad/challenge", the data API uses "/api/data". This means we'll have to modify the AuthenticationHelper class to take the complete discovery URL as input in the public constructor. To do this, we change the private _endpoint field to be of type Uri instead of string, and in the Authority property we pass in just the _endpoint instead of the _endpoint plus a path.
The result should look like this:

private Uri _endpoint = null;
private string _resource = null;
private string _authority = null;
private AuthenticationContext _authContext = null;
private AuthenticationResult _authResult = null;

public AuthenticationHelper(Uri endpoint)
{
    _endpoint = endpoint;
}

public string Authority
{
    get
    {
        if (_authority == null)
        {
            DiscoverAuthority(_endpoint);
        }
        return _authority;
    }
}

Now, go into the UrlFactory class and add a new enum named ApiType, with Admin and CustomerEngagement as values.

public enum ApiType
{
    Admin,
    CustomerEngagement
}
Next, add a new static method named GetDiscoveryUrl which takes a Uri and an ApiType enum as input and returns a Uri.

public static Uri GetDiscoveryUrl(Uri serviceUrl, ApiType type)
{
    var baseUrl = serviceUrl.GetLeftPart(UriPartial.Authority);
    if (type == ApiType.Admin)
    {
        return new Uri(baseUrl + "/api/aad/challenge");
    }
    else if (type == ApiType.CustomerEngagement)
    {
        return new Uri(baseUrl + "/api/data");
    }
    else
    {
        throw new Exception($"Enum value {type} does not have a discovery address configured");
    }
}
This allows us to extend with additional APIs in the future, for example for Operations or Financials.
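To see what the method produces, here is a quick sketch (the admin service URL below is illustrative): GetLeftPart(UriPartial.Authority) strips the path from the service URL, and the API-specific challenge path is appended.

```csharp
var serviceUrl = new Uri("https://admin.services.crm4.dynamics.com/api/v1/instances");

var adminDiscovery = UrlFactory.GetDiscoveryUrl(serviceUrl, ApiType.Admin);
// adminDiscovery → https://admin.services.crm4.dynamics.com/api/aad/challenge

var dataDiscovery = UrlFactory.GetDiscoveryUrl(serviceUrl, ApiType.CustomerEngagement);
// dataDiscovery → https://admin.services.crm4.dynamics.com/api/data
```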

Now, go back into our commandlet classes and modify the else clause to look like this:

else
{
    var discoveryUrl = UrlFactory.GetDiscoveryUrl(serverUrl, ApiType.Admin);
    _auth = new AuthenticationHelper(discoveryUrl);
}
Then change the AuthenticationHelper instantiation to take the discoveryUrl as a parameter instead of the serviceUrl. Remember to make this change in both of the commandlets.
Finally, rename the PSVariable from just "auth" to "adminauth". Again, do this in both commandlets, both where you get and where you set the variable.
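After these changes, the relevant section of each admin commandlet should look roughly like this:

```csharp
// Reuse the admin helper stored in the session, if present
if (SessionState.PSVariable.Get("adminauth") != null)
{
    _auth = SessionState.PSVariable.Get("adminauth").Value as AuthenticationHelper;
}
else
{
    // The helper now takes the discovery URL, not the service URL
    var discoveryUrl = UrlFactory.GetDiscoveryUrl(serverUrl, ApiType.Admin);
    _auth = new AuthenticationHelper(discoveryUrl);
}

// ... HttpClient request code unchanged ...

SessionState.PSVariable.Set("adminauth", _auth);
```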

We now have an even more flexible project which can support multiple APIs, and store the authenticated connection in the PowerShell session.

Testing MSDYN365 Customer Engagement

To test our new capabilities, add a new class file to the project named "GetDynamicsWhoAmI", and paste in the following code.

[Cmdlet(VerbsCommon.Get, "DynamicsWhoAmI")]
public class GetDynamicsWhoAmI : PSCmdlet
{
    [Parameter(Mandatory = true)]
    public string Organization;

    [Parameter(Mandatory = true)]
    [ValidateSet("NorthAmerica", "SouthAmerica", "Canada", "EMEA", "APAC", "Oceania", "Japan", "India", "NorthAmerica2", "UnitedKingdom", IgnoreCase = true)]
    public string Location;

    protected override void ProcessRecord()
    {
        base.ProcessRecord();
        Enum.TryParse(Location, out DataCenterLocations tenantLocation);
        var customerEngagementUrl = UrlFactory.GetUrl(Organization, tenantLocation, "/XRMServices/2011/organization.svc/web");

        AuthenticationHelper customerEngagementAuth = null;
        if (SessionState.PSVariable.Get("customerengagementauth") != null)
        {
            customerEngagementAuth = SessionState.PSVariable.Get("customerengagementauth").Value as AuthenticationHelper;
        }
        else
        {
            var customerEngagementDiscovery = UrlFactory.GetDiscoveryUrl(customerEngagementUrl, ApiType.CustomerEngagement);
            customerEngagementAuth = new AuthenticationHelper(customerEngagementDiscovery);
        }

        var client = new OrganizationWebProxyClient(customerEngagementUrl, false)
        {
            HeaderToken = customerEngagementAuth.AuthResult.AccessToken,
            SdkClientVersion = "8.2"
        };

        var whoAmI = client.Execute(new WhoAmIRequest());
        foreach (var att in whoAmI.Results)
        {
            Console.WriteLine($"{att.Key}: {att.Value}");
        }
        Console.ReadLine();

        SessionState.PSVariable.Set("customerengagementauth", customerEngagementAuth);
    }
}

What this does is get the service and discovery URLs for the MSDYN365 Customer Engagement organization specified, and then instantiate a new AuthenticationHelper based on the discovery URL.
Then, instead of making a plain HTTP request, we instantiate a new OrganizationWebProxyClient and inject the OAuth access token into its HeaderToken property. This means we can execute Organization requests against the API, and we can use early bound classes if we've generated them (did anyone mention XrmToolBox?).
Next we send a new WhoAmIRequest to the service, and we print the values returned to the console.
In addition, we're getting and setting the value as a PSVariable, so we can reuse that as well.

Open up the properties for the project, and inside the debug section change the command line arguments to the following. Remember to change YourOrganizationNameHere to your actual organization name (the X in https://X.crm.dynamics.com), and the location if needed.

-NoLogo -Command "Import-Module '.\MSDYN365AdminApiAndMore.dll'; Get-DynamicsInstances -Location EMEA; Get-DynamicsInstanceTypeInfo -Location EMEA; Get-DynamicsWhoAmI -Organization YourOrganizationNameHere -Location EMEA;"
This will run all of the commandlets we have created so far, so save the changes and hit [F5] to run it.

When it starts it will ask you for credentials just like last time. Provide that and wait for the instance response. When the instances are printed to the console, hit return to start the next query. Now it will not ask you for credentials, it will simply take a few seconds and then return the instance type codes. Hit return again, and now you will get a new window asking you for credentials. This is when the Customer Engagement authentication is instantiated. Fill in the credentials like before, and wait for the response.
If you've done everything correctly, you will see the following output in your terminal:

Congratulations! You now have the basis for automating almost everything related to your MSDYN365 Customer Engagement environment. Just hit return to end the processing.

The wrap up

So, we now have an awesome new API (with more functions to come), and we have a project which lets us write easy-to-use commandlets to simplify administration (especially for those admins who aren't familiar with the interface) and automate mundane tasks.
So what are we missing from this project now?
Exception handling and unit tests. There really should be more exception handling in this, but I leave that in your capable hands to figure out (or I will update the project later).
In addition, make sure you take a look at Jordi Montana's Fake Xrm Easy for easy unit testing with MSDYN365 Customer Engagement