
Thursday, August 24, 2017

Piggybacking on MSDYN365 PluginRegistrationTools ADAL implementation (plagiarizing Mikael Svenson)

My brilliant colleague, Mikael Svenson, wrote a cool blog post on piggybacking on the SharePoint Online Management Shell ADAL application.
Inspired by this (relatively) dirty hack, I tried to figure out which applications Microsoft has given us that support OAuth OOTB.
Turns out, the PluginRegistrationTool does!
I fired up the PluginRegistrationTool from the SDK, hooked Fiddler onto it, and hit the "create new connection" button in the tool.
I checked the query string in the initial authorize request, and voilà!

Splitting up this query string we get the following two values:
client_id=2ad88395-b77d-4561-9441-d0e40824f9bc
redirect_uri=app%3A%2F%2F5d3e90d6-aa8e-48a8-8f2c-58b45cc67315%2F

Cleaning up the redirect_uri gives us this nice app id redirect uri we can use:
app://5d3e90d6-aa8e-48a8-8f2c-58b45cc67315/


This allows us to piggyback on Microsoft's own app registration, which doesn't require user approval when you distribute an application. The example below is taken from my blog series on using the new admin API in PowerShell.

using Microsoft.IdentityModel.Clients.ActiveDirectory;
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;

namespace MSDYN365AdminApiAndMore.Helpers
{
    public class AuthenticationHelper
    {
        private static string _clientId = "2ad88395-b77d-4561-9441-d0e40824f9bc";
        private static string _redirectUrl = "app://5d3e90d6-aa8e-48a8-8f2c-58b45cc67315/";

        private Uri _endpoint = null;
        private string _resource = null;
        private string _authority = null;
        private AuthenticationContext _authContext = null;
        private AuthenticationResult _authResult = null;

        public AuthenticationHelper(Uri endpoint)
        {
            _endpoint = endpoint;
        }

        public string Authority
        {
            get
            {
                if (_authority == null)
                {
                    DiscoverAuthority(_endpoint);
                }
                return _authority;
            }
        }

        public AuthenticationContext AuthContext
        {
            get
            {
                if (_authContext == null)
                {
                    _authContext = new AuthenticationContext(Authority, false);
                }
                return _authContext;
            }
        }

        public AuthenticationResult AuthResult
        {
            get
            {
                Authorize();
                return _authResult;
            }
        }

        public HttpMessageHandler Handler
        {
            get
            {
                return new OAuthMessageHandler(this, new HttpClientHandler());
            }
        }

        private void DiscoverAuthority(Uri discoveryUrl)
        {
            try
            {
                Task.Run(async () =>
                {
                    var ap = await AuthenticationParameters.CreateFromResourceUrlAsync(discoveryUrl);
                    _resource = ap.Resource;
                    _authority = ap.Authority;
                }).Wait();
            }
            catch (Exception)
            {
                // Rethrow without resetting the stack trace (unlike "throw e;")
                throw;
            }
        }

        private void Authorize()
        {
            if (_authResult == null || _authResult.ExpiresOn.AddMinutes(-30) < DateTime.Now)
            {
                Task.Run(async () =>
                {
                    _authResult = await AuthContext.AcquireTokenAsync(_resource, _clientId, new Uri(_redirectUrl),
                    new PlatformParameters(PromptBehavior.Always));
                }).Wait();
            }
        }

        class OAuthMessageHandler : DelegatingHandler
        {
            AuthenticationHelper _auth = null;
            public OAuthMessageHandler(AuthenticationHelper auth, HttpMessageHandler innerHandler) : base(innerHandler)
            {
                _auth = auth;
            }
            protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
            {
                request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", _auth.AuthResult.AccessToken);
                return base.SendAsync(request, cancellationToken);
            }
        }
    }
}

Final thoughts:

Should you use this? No, probably not. Microsoft might change it at any time, or they could introduce connection string inputs for the tool, which would require you to register your own app.
Will I be using this? Great question!

Thursday, August 17, 2017

Using Dynamics365 Customer Engagement admin API with PowerShell, part 3

In part 1 and part 2 of this blog series we looked at scaffolding and building an authentication helper which we can use in a commandlet class. In this part we're going to build our own HTTP message handler and perform some queries against the admin API.
The code in its entirety is available in this GitHub repository.

Creating a custom HTTP message handler

As we saw previously, we now have a valid bearer token which we can use to query the admin API. We could create an HTTP request and send it to the API to get a result back. But instead of starting from scratch we're going to reuse the code from the Microsoft docs and create our own custom message handler, which will instantiate and propagate everything we need, as well as inject the token into the request.
Go into the AuthenticationHelper class, and append the following code to the end of the class (inside the AuthenticationHelper declaration).

class OAuthMessageHandler : DelegatingHandler
{
    AuthenticationHelper _auth = null;
    public OAuthMessageHandler(AuthenticationHelper auth, HttpMessageHandler innerHandler) : base(innerHandler)
    {
        _auth = auth;
    }
    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        request.Version = HttpVersion.Version11;
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", _auth.AuthResult.AccessToken);
        request.Headers.AcceptLanguage.Add(new StringWithQualityHeaderValue("NORWEGIAN-VIKING"));
        return base.SendAsync(request, cancellationToken);
    }
}
This class inherits from DelegatingHandler, which takes care of all the basic plumbing for us. It has an AuthenticationHelper field which we set in the public constructor, as well as an HttpMessageHandler which is passed to the base constructor.
Please notice the AcceptLanguage header that has been set. This is not necessary for the GET requests (at the time of writing), but it is required for the POST requests. As you might notice I haven't specified a valid language, and what happens then is that it defaults to English.
If, however, I were to specify nb-NO then the response would be in Norwegian, so there's a nice trick for you.
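For example, if you do want localized responses, the culture code is the only thing that changes (a quick sketch):

request.Headers.AcceptLanguage.Add(new StringWithQualityHeaderValue("nb-NO")); // Norwegian responses instead of the English default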
Next we override the SendAsync method to inject our headers. What we've done here is get the access token from the AuthResult in the AuthenticationHelper. What this means is the following:

  1. When the AuthResult property is retrieved it triggers the Authorize() method which uses the AuthContext property.
  2. When the AuthContext property is retrieved it instantiates a new AuthenticationContext object with the Authority property.
  3. When the Authority property is retrieved, the DiscoverAuthority method is triggered, which retrieves the 401 challenge that gives us the resource and authority based on the service URL set in the public constructor of the AuthenticationHelper class.
This means that everything we need is instantiated and propagated just by setting this one authorization header and accept-language, and it's easy to follow the flow of the code.

Finally we add a public property which will return a new instance of the handler. The handler will be disposed when we complete the request, so we need to make sure that we're instantiating a new one whenever we get it.

public HttpMessageHandler Handler
{
    get
    {
        return new OAuthMessageHandler(this, new HttpClientHandler());
    }
}

We are finally ready to actually perform some requests against the admin API.

Sending requests to the admin API

To send a request we must first add some code to our commandlet. Add the following lines to the end of the ProcessRecord method to perform the request, and then print the response to the console.

using (var httpClient = new HttpClient(auth.Handler))
{
    var result = httpClient.GetStringAsync(serverUrl).Result;
    Console.WriteLine(result);
    Console.ReadLine();
}
Because we're doing this in a script we're not bothering with async requests. We want the result at once, and we're not doing anything before the response is returned.
Once entered, hit [F5] to start debugging, and log in with the credentials of an MSDYN365 admin.
If this is the first time you've logged in with that user, you will be presented with a consent window which you need to approve.
It simply says that the application will use your authenticated credentials to perform actions on your behalf, and read the directory information (needed to pass claims to MSDYN365).
It looks more severe than it really is; if you're running code that asks you for credentials, then this prompt is not the thing you should be worried about.

Once the request is completed the output to your PowerShell window should look like this:
Congratulations! You're using the new admin API!

Reusing the connection in additional commandlets

When we extend this project to include more commandlets we should try to reuse our connection. To do this we should make a few changes to our commandlet.

[Cmdlet(VerbsCommon.Get, "DynamicsInstances")]
public class GetDynamicsInstances : PSCmdlet
{
    [Parameter(Mandatory = true)]
    [ValidateSet("NorthAmerica", "SouthAmerica", "Canada", "EMEA", "APAC", "Oceania", "Japan", "India", "NorthAmerica2", "UnitedKingdom", IgnoreCase = true)]
    public string Location { get; set; } // cmdlet parameters must be public properties

    private AuthenticationHelper _auth = null;

    protected override void ProcessRecord()
    {
        base.ProcessRecord();
        Enum.TryParse(Location, out DataCenterLocations tenantLocation);
        var serverUrl = UrlFactory.GetUrl("admin.services", tenantLocation, "/api/v1/instances");

        if (SessionState.PSVariable.Get("auth") != null)
        {
            _auth = SessionState.PSVariable.Get("auth").Value as AuthenticationHelper;
        }
        else
        {
            _auth = new AuthenticationHelper(serverUrl);
        }

        using (var httpClient = new HttpClient(_auth.Handler))
        {
            var result = httpClient.GetStringAsync(serverUrl).Result;
            Console.WriteLine(result);
            Console.ReadLine();
        }

        SessionState.PSVariable.Set("auth", _auth);
    }
}
As you can see we've now added a private _auth field to the cmdlet.
In addition we've put in an if clause that checks whether there is an existing PSVariable named "auth". If there is, its value is assigned to the _auth field.
If it is not present we instantiate a new AuthenticationHelper object and assign that to the field.

At the end of the class we've added a line which sets a PSVariable named "auth" to our AuthenticationHelper object. This stores the instantiated AuthenticationHelper in the PowerShell session, so we can reuse it while in the same session.

To demonstrate this, copy the content of this class, and create a new class named GetDynamicsInstanceTypeInfo.
Paste the code into the new class, and change the following lines (a sketch of the result follows the list):
  • Change the cmdlet decoration to say "DynamicsInstanceTypeInfo"
  • Change the class name to GetDynamicsInstanceTypeInfo
  • Change the trailing part of the serverUrl to "/api/v1/instancetypeinfo"
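The result should look something like this (only a sketch; everything not shown is identical to GetDynamicsInstances):

[Cmdlet(VerbsCommon.Get, "DynamicsInstanceTypeInfo")]
public class GetDynamicsInstanceTypeInfo : PSCmdlet
{
    // Same Location parameter, auth lookup and PSVariable handling as GetDynamicsInstances;
    // only the trailing path in ProcessRecord() differs:
    // var serverUrl = UrlFactory.GetUrl("admin.services", tenantLocation, "/api/v1/instancetypeinfo");
}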
Next go into the properties of the project, and change the command line arguments to the following

-NoLogo -Command "Import-Module '.\MSDYN365AdminApiAndMore.dll'; Get-DynamicsInstances -Location EMEA; Get-DynamicsInstanceTypeInfo -Location EMEA"
Now, start a new debug session, and log in as you did previously. You will get the same list of instances as you did before, but if you hit return then it will perform a new request to get instance types. This request will complete without re-asking for your credentials, which means we're successfully storing and retrieving the PSVariable in our session.

Extending the authentication class to support MSDYN365 data API

Our authentication helper works great, but we can make it even greater by making it able to handle normal MSDYN365 authentication as well. The problem we face is that to get the WWW-Authenticate headers from the MSDYN365 Customer Engagement API, we need to use a different URL path than for the admin services.
Where the admin services use "/api/aad/challenge", the data API uses "/api/data". This means we'll have to modify the AuthenticationHelper class to take the complete discovery URL as an input in the public constructor. To do this, we're changing the private _endpoint variable to be of type Uri instead of string, and in the Authority property we just pass in the _endpoint instead of the _endpoint and the path.
The result should look like this:

private Uri _endpoint = null;
private string _resource = null;
private string _authority = null;
private AuthenticationContext _authContext = null;
private AuthenticationResult _authResult = null;

public AuthenticationHelper(Uri endpoint)
{
    _endpoint = endpoint;
}

public string Authority
{
    get
    {
        if (_authority == null)
        {
            DiscoverAuthority(_endpoint);
        }
        return _authority;
    }
}

Now, go into the UrlFactory class and add a new enum named ApiType, with Admin and CustomerEngagement as values.

public enum ApiType
{
    Admin,
    CustomerEngagement
}
Next add a new static method named GetDiscoveryUrl which takes a Uri and an ApiType enum as input, and returns a Uri.

public static Uri GetDiscoveryUrl(Uri serviceUrl, ApiType type)
{
    var baseUrl = serviceUrl.GetLeftPart(UriPartial.Authority);
    if (type == ApiType.Admin)
    {
        return new Uri(baseUrl + "/api/aad/challenge");
    }
    else if (type == ApiType.CustomerEngagement)
    {
        return new Uri(baseUrl + "/api/data");
    }
    else
    {
        throw new Exception($"Enum with name {type.ToString()} does not have discovery address configured");
    }
}
This allows us to extend with additional APIs in the future, for example for Operations or Financials.
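To make the behavior concrete, here's a quick usage sketch (the organization URL is made up):

var serviceUrl = new Uri("https://contoso.crm4.dynamics.com/XRMServices/2011/organization.svc/web");
var discoveryUrl = UrlFactory.GetDiscoveryUrl(serviceUrl, ApiType.CustomerEngagement);
// discoveryUrl is now https://contoso.crm4.dynamics.com/api/data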

Now, go back into our commandlet classes and modify the else clause to look like this:

else
{
    var discoveryUrl = UrlFactory.GetDiscoveryUrl(serverUrl, ApiType.Admin);
    _auth = new AuthenticationHelper(discoveryUrl);
}
Then change the AuthenticationHelper instantiation to take the discoveryUrl as a parameter instead of the serviceUrl. Remember to change this in both of the commandlets.
Finally, change the PSVariable name from just "auth" to "adminauth", remember to do it for both commandlets, in both when you get and set the variable.
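After the rename, the relevant lines in each commandlet look like this:

if (SessionState.PSVariable.Get("adminauth") != null)
{
    _auth = SessionState.PSVariable.Get("adminauth").Value as AuthenticationHelper;
}
// ...
SessionState.PSVariable.Set("adminauth", _auth);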

We now have an even more flexible project which can support multiple APIs, and store the authenticated connection in the PowerShell session.

Testing MSDYN365 Customer Engagement

To test our new capabilities, add a new class file to the project named "GetDynamicsWhoAmI", and paste in the following code.

[Cmdlet(VerbsCommon.Get, "DynamicsWhoAmI")]
public class GetDynamicsWhoAmI : PSCmdlet
{
    [Parameter(Mandatory = true)]
    public string Organization;

[Parameter(Mandatory = true)]
[ValidateSet("NorthAmerica", "SouthAmerica", "Canada", "EMEA", "APAC", "Oceania", "Japan", "India", "NorthAmerica2", "UnitedKingdom", IgnoreCase = true)] public string Location; protected override void ProcessRecord() { base.ProcessRecord(); Enum.TryParse(Location, out DataCenterLocations tenantLocation); var customerEngagementUrl = UrlFactory.GetUrl(Organization, tenantLocation, "/XRMServices/2011/organization.svc/web"); AuthenticationHelper customerEngagementAuth = null; if (SessionState.PSVariable.Get("customerengagementauth") != null) { customerEngagementAuth = SessionState.PSVariable.Get("customerengagementauth").Value as AuthenticationHelper; } else { var customerEngagementDiscovery = UrlFactory.GetDiscoveryUrl(customerEngagementUrl, ApiType.CustomerEngagement); customerEngagementAuth = new AuthenticationHelper(customerEngagementDiscovery); } var client = new OrganizationWebProxyClient(customerEngagementUrl, false) { HeaderToken = customerEngagementAuth.AuthResult.AccessToken, SdkClientVersion = "8.2" }; var whoAmI = client.Execute(new WhoAmIRequest()); foreach (var att in whoAmI.Results) { Console.WriteLine($"{att.Key}: {att.Value}");
        }
        Console.ReadLine();

        SessionState.PSVariable.Set("customerengagementauth", customerEngagementAuth);
    }
}

What this does is get the service and discovery URLs for the MSDYN365 Customer Engagement organization specified. Then it instantiates a new AuthenticationHelper based on the discovery URL.
Then, instead of using a normal HTTP request, we instantiate a new OrganizationWebProxyClient and inject the OAuth token into the HeaderToken. This means we can do Organization requests against the API, and we can use early bound classes if we've created them (did anyone mention XrmToolBox?).
Next we send a new WhoAmIRequest to the service, and we print the values returned to the console.
In addition, we're getting and setting the value as a PSVariable, so we can reuse that as well.

Open up the properties for the project, and inside the debug section change the command line arguments to the following. Remember to change YourOrganizationNameHere to your actual organization name (the X in https://X.crm.dynamics.com), and adjust the location if needed.

-NoLogo -Command "Import-Module '.\MSDYN365AdminApiAndMore.dll'; Get-DynamicsInstances -Location EMEA; Get-DynamicsInstanceTypeInfo -Location EMEA; Get-DynamicsWhoAmI -Organization YourOrganizationNameHere -Location EMEA;"
This will run all of the commandlets we have created so far, so save the changes and hit [F5] to run it.

When it starts it will ask you for credentials just like last time. Provide that and wait for the instance response. When the instances are printed to the console, hit return to start the next query. Now it will not ask you for credentials, it will simply take a few seconds and then return the instance type codes. Hit return again, and now you will get a new window asking you for credentials. This is when the Customer Engagement authentication is instantiated. Fill in the credentials like before, and wait for the response.
If you've done everything correctly, you will see the following output in your terminal.

Congratulations! You now have the basis for automating almost everything related to your MSDYN365 Customer Engagement environment. Just hit return to end the processing.

The wrap up

So, we now have a new awesome API (with more functions to come), and we have an awesome project which will allow us to write easy-to-use commandlets which can be used to simplify administration (especially for those admins who aren't familiar with the interface) and automate mundane tasks.
So what are we missing from this project now?
Exception handling and unit tests. There really should be more exception handling in this, but I leave that in your capable hands to figure out (or I will update the project later).
In addition, make sure you take a look at Jordi Montana's Fake Xrm Easy for easy unit testing with MSDYN365 Customer Engagement.

Monday, October 31, 2016

Microsoft CRM + Azure Service Bus, part 4 (two-way relaying and using Azure Event Hubs)

Integrating Microsoft Dynamics CRM with Microsoft Azure Service Bus, using queues, topics, relays and the Event Hub

In the previous posts we've looked at using endpoints in CRM, how we can create a simple workflow to push the Activity Context to the Azure Service Bus, using queues, and creating a simple one-way listener. In this post we'll focus on extending the code to enable two-way relaying, which will allow us to return data back to CRM. We'll also look at how we can integrate CRM with Azure Event Hubs, a service based on ASB which allows for a lot of throughput.

Allow return value in the CRM Workflow step

The first thing we'll do is to write some more logic for our workflow, to enable us to use a return value. The relay service only returns a string, which means that we could easily push JSON or base64 through the bus to have even more meaningful feedback from the service. Be sure to remember the maximum message size for your service bus, and the timeout values. I do not recommend keeping a connection for a long time to transfer end-to-end data, but there are loads of scenarios where you'd like to get something back.

[Output("Return value")]
public OutArgument<string> ReturnValue { get; set; }
OK, so what we've done here is to add an output variable to our workflow step. This allows us to take the return value and use it inside our workflow in later steps.

protected override void Execute(CodeActivityContext executionContext)
{
    var context = executionContext.GetExtension<IWorkflowContext>();
    var serviceEndpointNotifySvc = executionContext
        .GetExtension<IServiceEndpointNotificationService>();

    var result = serviceEndpointNotifySvc.Execute(
        ServiceEndpoint.Get(executionContext), context);

    ReturnValue.Set(executionContext, result);
}

The code looks mostly the same as it did in part 2 of this series, with a couple of additions. Firstly, we assign the return value from the notification service's Execute method to a variable "result". Next we set the value of the new output variable "ReturnValue" to the result, using the execution context of the workflow.
Now just build and deploy the updated workflow, and we'll head into MSCRM. Navigate to the Settings -> Processes area, and open up the workflow created earlier. Deactivate the workflow to enable editing, and then add an update step after the custom workflow step.
I'm going to update the Description field on the Account to the ReturnValue from our custom workflow step.


Finally, go into the plugin registration tool to update the relay service we added in part 3. Make sure it's changed to a two-way relay, and optionally update the name and path if you want. Just make sure to update the same values inside the service.

Writing a two-way relay service

To enable two-way communication we first have to change our service behavior class. We have to change the interface it inherits from and add a return value.

[ServiceBehavior]
public class RemoteService : ITwoWayServiceEndpointPlugin
{
    public string Execute(RemoteExecutionContext c)
    {
        Console.WriteLine(
            $"Entity: {c.PrimaryEntityName}, id: {c.PrimaryEntityId}");
        return "Message processed in two-way listener";
    }
}

As you can see we're now inheriting from the ITwoWayServiceEndpointPlugin interface; we've modified the Execute method to return a value, and we've added a return statement after printing to the console. This means that whenever our relay service is triggered it will print out the message as before, but it will also return our magic string.
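As an aside, if you wanted richer feedback than a magic string, you could serialize JSON before returning it. A sketch, assuming Json.NET is referenced in the project:

public string Execute(RemoteExecutionContext c)
{
    // Push a structured result back through the relay instead of a plain string
    return Newtonsoft.Json.JsonConvert.SerializeObject(
        new { entity = c.PrimaryEntityName, id = c.PrimaryEntityId, processedAt = DateTime.UtcNow });
}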

The only other change we have to make is to change the typeof implementation. Earlier we specified IServiceEndpointPlugin, so we have to change it to ITwoWayServiceEndpointPlugin. I've done a little dirty hack, because I've been changing back and forth between relay types while testing, so mine looks like this:
sh.AddServiceEndpoint(
        typeof(RemoteService).GetInterfaces().First(),
        new WS2007HttpRelayBinding(),
        AppSettings["SBusEndpoint"])
    .Behaviors.Add(tcEndpointBehavior);

sh.Open();
Console.WriteLine("TwoWay-listener, press ENTER to close");
Console.ReadLine();

sh.Close();

While this reflection trick may seem like a good idea, it will only work until you've built yourself a framework to reuse across different projects/customers, and all of a sudden you're implementing several interfaces and everything stops working. For this demo it's OK, but I'd rather specify the interface type in production code.
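For reference, the explicit version I'd rather use in production simply names the interface:

sh.AddServiceEndpoint(
        typeof(ITwoWayServiceEndpointPlugin),
        new WS2007HttpRelayBinding(),
        AppSettings["SBusEndpoint"])
    .Behaviors.Add(tcEndpointBehavior);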

Testing two-way relaying

Now that we've got that out of the way, it's time to test our new implementation. Start your new and improved service, go into CRM, and trigger the plugin. If everything goes as planned you'll be looking at the following window for your relay service.

And if we go into CRM and check out the description on our account entity, we'll see the following.

So that's all it takes to have two-way communication working in a relay service. We've got CRM on one side, Azure Service Bus as the communication channel, and a simple console project on the other side.

What's a REST listener?

One option I haven't mentioned so far is the REST listener, which can be specified in the plugin registration tool. This is simply a two-way relay which uses REST endpoints instead of .Net binaries. This would allow you to create and run a relay service in Node.js, Apache Tomcat, PowerShell, IIS, or whichever web server you'd want. Just to trigger all the popular buzzwords: this can enable you to use MSCRM and Azure Service Bus, and deploy Node.js relay services in Docker containers.

Azure Event Hubs

Azure Event Hubs is a special kind of service bus which is designed to handle huge amounts of messages, we're talking millions of messages per second. It's probably not what you're looking at for general integration purposes, but there are several scenarios where it could benefit your company in a big way.
The first thing I thought of was using this for auditing. Just write a simple workflow or plugin which pushes any creates, updates and deletes as messages to the event hub. Then you can use stream analytics or some other technology to populate an auditing system with actions performed by your users and integration systems. Anyone who's used the out-of-the-box auditing functions in MSCRM knows that processing the information is tedious, at best, and more often than not close to impossible to get any useful data from. But if you start pushing it into an external auditing system based on Azure services then you could use clever stuff like temporal tables to design a robust, maintainable system.

The second thing I thought of was predictive analysis. Push raw data to the event hub, transfer it to an Azure Data Warehouse for real-time data crunching, and you have a great additional source of data that can be used for predictive analysis or (buzzword warning) machine learning.

There are probably a lot of cool things you can do with this, but I won't elaborate on all my ideas in this blog post. What I want to stress is the price tag. It is incredibly cheap compared to legacy systems based on clusters of servers running different message queues with some expensive bus technology on top. And the performance is great no matter which size you pick; it doesn't matter if you're running hundreds, thousands, or hundreds of millions of messages per day. There's no entry-level cost either, and the price scales with usage.


That's all for this blog series (at least for now). I might come back to visit later on when I've done some more in-the-field deployments with the ASB.

Wednesday, February 24, 2016

Using and mocking OrganizationServiceProxy (part 1)

How to use the OrganizationServiceProxy with Dynamics CRM, and mocking it

This is a two-part blog series about how to use the OrganizationServiceProxy class with MSCRM. I'll demonstrate how to connect using explicit, hard coded credentials as well as using the service credentials. I'll finish up by giving some tips on mocking the OrganizationServiceProxy to simplify unit testing in your projects.
In part 1 we'll look at utilizing the OrganizationServiceProxy and creating some code which allows us to easily and flexibly integrate with MSCRM.

Prerequisites

To follow the steps described in this post you'll need a version of Microsoft Visual Studio, as well as the Windows Identity Foundation framework enabled on your operating system.
Visual Studio is available in a free (community) version found here
Windows Identity Foundation can be activated with the methods described here

Using OrganizationServiceProxy

Set up the project

I'm going ahead and creating a new Web Application project in Visual Studio. I'm not worrying about hosting and which functions I'll need, so I'll just set up a simple MVC project with defaults. I'm also going ahead and creating a unit test project at the same time, which will be used to demonstrate mocking a bit later on.


When the project has been created, open up the NuGet package manager and search for Microsoft.CrmSdk.CoreAssemblies. Add this package to both the MVC project and the Test project. You can also add it using the package manager console with the following command:
Install-Package Microsoft.CrmSdk.CoreAssemblies

Next add a new, empty controller to your MVC project named CrmController. In the index method we're gonna start by defining a new OrganizationServiceProxy with values as described:

public class CrmController : Controller
{
    // GET: Crm
    public ActionResult Index()
    {
        var crmUrl = new Uri(@"https://crmviking.crm4.dynamics.com");
        var authCredentials = new AuthenticationCredentials();
        authCredentials.ClientCredentials.UserName.UserName = "username@domain.com";
        authCredentials.ClientCredentials.UserName.Password = "password";

        var service = new OrganizationServiceProxy(uri: crmUrl, homeRealmUri: null, clientCredentials: authCredentials.ClientCredentials, deviceCredentials: null);
     
        return View();
    }

}

With just a few lines of code you've already got a working service context which can be used to send to and retrieve from Dynamics CRM. I'll explain the different inputs used to create a new OrganizationServiceProxy:
uri: This is the URL to your Dynamics CRM instance.
homeRealmUri: This is the WS-Trust URL to your secondary ADFS server, for example if you're federating across domains. I'm not using it in my case, but it could be applicable in yours.
clientCredentials: These are the user credentials used to authenticate with CRM.
deviceCredentials: This is for when you generate device credentials for your service.

Refactoring service generation

Now, the next logical step (to me) is moving this out into its own class, so we can reuse it for our other methods. What we're doing is generating a new service context based on predefined values, so we'll refactor it into its own CrmServiceFactory class. At the same time we'll extract the credential values and put them into our web.config file (how to store and use your credentials is a discussion better left for another post, but out of two evils, I'd rather specify them in the web.config than hard code them in your class).
Add the following lines to your web.config, inside the <configuration> element.
<appSettings>
  <add key="CrmUserName" value="name@domain.com" />
  <add key="CrmPassword" value="password" />
</appSettings>
<connectionStrings>
  <add name="CrmWebServer" connectionString="https://crmviking.crm4.dynamics.com" />
</connectionStrings>


Refactoring our code into a factory for generating a new OrganizationServiceProxy gives us the following factory code:

public static OrganizationServiceProxy GetCrmService()
{
    var crmUrl = new Uri(ConfigurationManager.ConnectionStrings["CrmWebServer"].ConnectionString);
    var authCredentials = new AuthenticationCredentials();
    authCredentials.ClientCredentials.UserName.UserName = ConfigurationManager.AppSettings["CrmUserName"];
    authCredentials.ClientCredentials.UserName.Password = ConfigurationManager.AppSettings["CrmPassword"];

    var service = new OrganizationServiceProxy(uri: crmUrl, homeRealmUri: null, clientCredentials: authCredentials.ClientCredentials, deviceCredentials: null);

    return service;
}

Now we can change the implementation in our controller to simply:
var service = CrmServiceFactory.GetCrmService();


Using service credentials

If we want to use service credentials we start by specifying which credentials will be used to run our application. For an MVC application we do that by specifying the user account settings in the IIS Application Pool. More information about setting the service credentials in IIS is described here (technet).
Next we need to change our code implementation as follows:

public static OrganizationServiceProxy GetCrmService()
{
    var crmUrl = new Uri(ConfigurationManager.ConnectionStrings["CrmWebServer"].ConnectionString);
    var authCredentials = new AuthenticationCredentials();
    authCredentials.ClientCredentials.Windows.ClientCredential = CredentialCache.DefaultNetworkCredentials;

    var service = new OrganizationServiceProxy(uri: crmUrl, homeRealmUri: null, clientCredentials: authCredentials.ClientCredentials, deviceCredentials: null);

    return service;
}

As you can see, we've replaced the explicit declaration of the username and password with the credentials that our application is running as.
This way we won't be relying on hard coded values, and we don't risk "giving away" our credentials if somebody snatches up our source code.

Using the OrganizationServiceProxy

First off, TechNet has a lot of information and examples on how to use the CRM components, and I highly recommend that you spend a fair amount of time reading up on them. There's a lot more to coding against CRM than using classes in .Net. Here's the URL to the OrganizationServiceProxy.

Implementing a create method

OK. We'll just create a super simple class which will create an account. We'll name it AccountRepository.

public void Create()
{
    var service = CrmServiceFactory.GetCrmService();
    var account = new Entity(entityName: "account");
    account.Attributes["name"] = "Contoso";
    service.Create(account);
}

That was simple, good to go, right? Not quite; if I left it at that, the Marvelous Hosk would throw harsh words my way. We have some basic principles we should adhere to, mainly dependency injection. We'll modify our code to take in the service in the default constructor, and we'll take the name used to create the account as input for the Create method.

private readonly OrganizationServiceProxy service;
public AccountRepository(OrganizationServiceProxy service)
{
    this.service = service;
}
public Guid Create(string name)
{
    var account = new Entity(entityName: "account");
    account.Attributes["name"] = name;
    var accountId = service.Create(account);

    return accountId;
}


OK, that's a bit better: we can reuse the class in different projects without having to rewrite any logic, and we can create accounts with different names as well. In addition, we're returning the unique identifier (Guid) of the newly created account, which is useful in a number of different scenarios.
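A quick usage sketch (assuming the CrmServiceFactory from earlier):

var repository = new AccountRepository(CrmServiceFactory.GetCrmService());
var accountId = repository.Create("Contoso"); // the Guid of the newly created account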

Implementing a retrieve method

Implementing a retrieve method is really simple. We'll just add a method to our existing class as follows:

public Entity Retrieve(Guid id)
{
    var account = service.Retrieve("account", id, new ColumnSet(true));
    return account;
}

That's easy and self-explanatory, but unfortunately it requires us to know the id of the account we're retrieving, and I for one do not go around remembering the Guids of my accounts.
So what we'll do is change this implementation to query CRM for an account based on the account name, because that's a value we'll remember. The only thing is, when we do a query we'll retrieve a list of entities. Querying for the account name will potentially give us multiple accounts as a result, so I think we'll go ahead and create a new method, named RetrieveByName.

public EntityCollection RetrieveByName(string name)
{
    var query = new QueryExpression("account");
    query.ColumnSet = new ColumnSet(true);
    query.Criteria.AddCondition(new ConditionExpression("name", ConditionOperator.Equal, name));

    var accounts = service.RetrieveMultiple(query);
    return accounts;
}

Now we're retrieving a collection of entities. If we wanted, we could also return a generic list of entities, but I would rather do that elsewhere in my code than implement logic here which makes the method more rigid and less reusable.
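For completeness, converting at the call site could look like this (requires a using for System.Linq):

var accounts = repository.RetrieveByName("Contoso").Entities.ToList();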

Implementing an update method

As you might expect, updating entities isn't much harder. I'll jump straight into implementing an Update method. It takes an Entity as input, which means we'll be doing the main manipulation in other classes. This might seem redundant in our example, because we're not doing anything that we couldn't do by just calling the OrganizationServiceProxy's Update method. For most deployments that's probably all you need as well, but for some scenarios you might want to do additional, mandatory manipulation every time an update is performed. You might want to log whenever it's called, or you might want to implement a date field which is supposed to be updated whenever an update occurs. Additionally, I like to handle all my organization queries inside my repositories.

public void Update(Entity entity)
{
    service.Update(entity);
}

Easy peasy.

Implementing a status update method

Updating the status of a record is a bit special for Dynamics CRM. Instead of simply updating the state and status using the update method you have to send a SetStateRequest.
Here's the code we'll implement.

public void UpdateStatus(Guid id, int state, int status)
{
    var stateRequest = new SetStateRequest();
    stateRequest.EntityMoniker = new EntityReference("account", id);
    stateRequest.State = new OptionSetValue(state);
    stateRequest.Status = new OptionSetValue(status);

    service.Execute(stateRequest);
}

There's no magic in this code either, but as you might notice it is quite generic. We're already taking in the entity id, the state value and the status value. The only parameter missing is the entity logical name; add that and we could reuse it across all entities, and that's exactly what we'll do. A point I want to make is that we'd then be passing in four parameters, so to stay in Uncle Bob's good graces we'll create a model to pass as the input instead.

First off, here's our model

public class CrmStatusModel
{
    public Guid Id { get; set; }
    public string EntityName { get; set; }
    public int StateValue { get; set; }
    public int StatusValue { get; set; }
}


Next up is our new, generic status update class. I went ahead and named it CrmStatusHandler. Like our repository, it takes an OrganizationServiceProxy in the default constructor.

private readonly OrganizationServiceProxy service;
public CrmStatusHandler(OrganizationServiceProxy service)
{
    this.service = service;
}
public void UpdateStatus(CrmStatusModel model)
{
    var stateRequest = new SetStateRequest();
    stateRequest.EntityMoniker = new EntityReference(model.EntityName, model.Id);
    stateRequest.State = new OptionSetValue(model.StateValue);
    stateRequest.Status = new OptionSetValue(model.StatusValue);

    service.Execute(stateRequest);
}


Now we can use this handler to update the status for all our entities, and we've also got a model instead of four separate parameters.
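A quick usage sketch (the state/status values below are the standard inactive codes for account, but they depend on your entity metadata):

var statusHandler = new CrmStatusHandler(CrmServiceFactory.GetCrmService());
statusHandler.UpdateStatus(new CrmStatusModel
{
    EntityName = "account",
    Id = accountId, // a known account Guid
    StateValue = 1, // Inactive
    StatusValue = 2 // Inactive
});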

Create additional entity repositories

Now we've seen how to implement a repository for the account entity. I'm gonna go ahead and create another repository for the contact entity. I'll implement the same methods as we did in the account repository, with the same input parameters, except for the query by name.

private OrganizationServiceProxy service;

public ContactRepository(OrganizationServiceProxy service)
{
    this.service = service;
}
public Guid Create(string name)
{
    var contact = new Entity("contact");
    contact.Attributes["name"] = name;
    var contactId = service.Create(contact);

    return contactId;
}

public Entity Retrieve(Guid id)
{
    var contact = service.Retrieve("contact", id, new ColumnSet(true));
    return contact;
}

public void Update(Entity entity)
{
    service.Update(entity);
}


As you can see, it's pretty much the same as the account repository, only for the contact entity. In addition, I'll create two methods for querying by values instead: one for querying by first name, and one for querying by last name.

public EntityCollection RetrieveByFirstName(string name)
{
    var query = new QueryExpression("contact");
    query.ColumnSet = new ColumnSet(true);
    query.Criteria.AddCondition(new ConditionExpression("firstname"ConditionOperator.Equal, name));

    var contacts = service.RetrieveMultiple(query);
    return contacts;
}

public EntityCollection RetrieveByLastName(string name)
{
    var query = new QueryExpression("contact");
    query.ColumnSet = new ColumnSet(true);
    query.Criteria.AddCondition(new ConditionExpression("lastname"ConditionOperator.Equal, name));

    var contacts = service.RetrieveMultiple(query);
    return contacts;
}

Create an adapter

Lastly, we'll create an adapter to utilize our repositories. I'm going to simulate a situation where we'll always create a contact whenever an account is created, and we'll create a method to deactivate a company when a contact is deactivated. These aren't necessarily methods you'd want to implement in a real environment, but they're a good example of where you'd want to utilize an adapter pattern to combine the usage of several repositories.

public void CreateCustomers(string accountName, string contactName)
{
    var service = CrmServiceFactory.GetCrmService();
    var accountRepository = new AccountRepository(service);
    var contactRepository = new ContactRepository(service);

    var accountId = accountRepository.Create(accountName);
    var contactId = contactRepository.Create(contactName);


    var contact = contactRepository.Retrieve(contactId);
    contact.Attributes["parentcustomerid"] = new EntityReference("account", accountId);
    contactRepository.Update(contact);
}

public void DeactivateCustomers(Guid contactId)
{
    var service = CrmServiceFactory.GetCrmService();
    var accountRepository = new AccountRepository(service);
    var contactRepository = new ContactRepository(service);
    var statusHandler = new CrmStatusHandler(service);

    var contact = contactRepository.Retrieve(contactId);
    var accountReference = (EntityReference)contact.Attributes["parentcustomer"];

    var contactStatus = new CrmStatusModel()
    {
        EntityName = "contact",
        Id = contactId,
        StateValue = 1,
        StatusValue = 2
    };
    var accountStatus = new CrmStatusModel()
    {
        EntityName = "account",
        Id = accountReference.Id,
        StateValue = 1,
        StatusValue = 2
    };

    statusHandler.UpdateStatus(contactStatus);
    statusHandler.UpdateStatus(accountStatus);
}

The first thing you might notice is that these two methods have some redundant code, which already gives you an inkling that there's potential for refactoring and improvement. There are some immediate changes we could make, mainly around the number of instantiated classes and, yet again, breaking the dependency injection rules.

The first thing we'll do to reduce the redundancy and get better DI is to add the OrganizationServiceProxy as an input to the public constructor of our adapter. Then, in the public constructor, we'll set up our repositories as private readonly fields, so they're available to both of the methods inside the adapter. Another thing to note is that the second method also uses the status update handler we created earlier. Creating a new class instance is cheap, especially when we've already got the OrganizationServiceProxy for our adapter, so I'm going to instantiate the status handler in the constructor as well, even though we might not use it for a particular instance of the adapter class.

private readonly AccountRepository accountRepository;
private readonly ContactRepository contactRepository;
private readonly CrmStatusHandler statusHandler;

public CustomerAdapter(OrganizationServiceProxy service)
{
    accountRepository = new AccountRepository(service);
    contactRepository = new ContactRepository(service);
    statusHandler = new CrmStatusHandler(service);
}


As you can see this hasn't reduced the number of lines noticeably, but we've got control of the instances at the top of our class declaration, and we can easily change or manipulate them in the future without changing the values inside each method. We'll do some more with our code in the next part, which is unit testing our new classes using Moq, so if you've got objections to the changes just made, I'd check that out first.
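With those fields in place, the first method shrinks to something like this (the deactivation method changes the same way):

public void CreateCustomers(string accountName, string contactName)
{
    var accountId = accountRepository.Create(accountName);
    var contactId = contactRepository.Create(contactName);

    var contact = contactRepository.Retrieve(contactId);
    contact.Attributes["parentcustomerid"] = new EntityReference("account", accountId);
    contactRepository.Update(contact);
}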

Disposing your objects

Remember that the OrganizationServiceProxy creates new network connections, and you should always call the Dispose method when you're done (or instantiate it in a using statement). The network connections aren't managed by the CLR, so even in your MVC/Web API project, where controllers are instantiated and thrown away in milliseconds, the connections will stay open until the idle timeout occurs.
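A minimal sketch of the using statement approach:

using (var service = CrmServiceFactory.GetCrmService())
{
    var adapter = new CustomerAdapter(service);
    adapter.CreateCustomers("Contoso", "Ola Nordmann");
} // the proxy, and its network connection, is disposed here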

Wrap-up

In this part we've looked at how we can utilize the OrganizationServiceProxy to integrate with Microsoft Dynamics CRM. We've created some repositories, a generic status handler and mixed all of our classes into a nice, extensible adapter class. In part 2 we'll look at unit testing these classes, and mocking the OrganizationServiceProxy using Moq. To do that we need to take a look at interfaces, and you'll understand the decisions made in this part even better.


Until then, happy CRM-ing!