UI Testing with EasyRepro and ADFS

Last week I stumbled across a tweet from Wael Hamze. You might know him from his VSTS add-on, the XRM CI Framework. This time he posted on his blog about the Microsoft UI Testing Framework for Dynamics 365.

I followed it and it was working really well, except that I struggled with the login. In my case, Dynamics 365 still forwards to an on-premise ADFS website to authenticate, and EasyRepro can only handle the O365 authentication out of the box.


So I did some UI testing against my demo instance and figured out that it works fine. Of course it is not as fast as the other tools I have looked at so far, but I liked its easy approach.

Anyhow, due to the login limitation, I couldn't create the tests against the system where I wanted to automate the testing.

An optional login parameter was the solution

After a while I figured out that the Login method allows a custom action to override the authentication part. This was finally the solution.

Instead of just calling the Login method like this:

xrmBrowser.LoginPage.Login(_xrmUri, _username, _password);

I was calling it with a reference to a custom handler:

xrmBrowser.LoginPage.Login(_xrmUri, _username, _password, LoginViaAdfs);

Now I only had to override the authentication part: find the input fields for username and password, click the submit button, then decline the "Stay signed in?" page and wait for Dynamics to come up.

Here is the resulting method I added:

public void LoginViaAdfs(LoginRedirectEventArgs args)
{
    var d = args.Driver;

    // Fill username and password by ID
    // (ToUnsecureString is EasyRepro's SecureString extension)
    if (d.IsVisible(By.Id("ctl00_ContentPlaceHolder1_UsernameTextBox")))
        d.FindElement(By.Id("ctl00_ContentPlaceHolder1_UsernameTextBox"))
            .SendKeys(args.Username.ToUnsecureString());

    if (d.IsVisible(By.Id("ctl00_ContentPlaceHolder1_PasswordTextBox")))
        d.FindElement(By.Id("ctl00_ContentPlaceHolder1_PasswordTextBox"))
            .SendKeys(args.Password.ToUnsecureString());

    // Click the submit button
    if (d.IsVisible(By.Id("ctl00_ContentPlaceHolder1_SubmitButton")))
        d.ClickWhenAvailable(By.Id("ctl00_ContentPlaceHolder1_SubmitButton"));

    // Wait for the "Stay signed in?" page and decline it
    d.WaitUntilVisible(By.Id("idBtn_Back"),
        new TimeSpan(0, 0, 60),
        e => { e.WaitForPageToLoad(); },
        f => { throw new Exception("Login page failed."); });

    if (d.IsVisible(By.Id("idBtn_Back")))
        d.ClickWhenAvailable(By.Id("idBtn_Back"));

    // Wait for the CRM page to load
    d.WaitUntilVisible(By.XPath(Elements.Xpath[Reference.Login.CrmMainPage]),
        new TimeSpan(0, 0, 60),
        e => { e.WaitForPageToLoad(); },
        f => { throw new Exception("Login page failed."); });
}

Enjoy, and happy UI testing, even if ADFS is in use.


Automated Build of XrmToolBox Plugins with VSTS

After my two posts about how to build versioned XrmToolBox plugins and how to merge 3rd-party assemblies into your plugin, the next step is to automate the whole build.

My fellow and Microsoft MVP Jonas Rappen took the challenge and unveiled his automated build using VSTS. Thank you very much for that.


These were my first steps into automated builds, and I was impressed that I don't have to host the code on VSTS just to get the automated build up and running. Following the instructions, I could set up my very first automated build, which of course failed multiple times. *grrr*

I struggled with two things. First of all, I was using ILMerge to include 3rd-party assemblies. And second, the assembly version and file version did not match the build number. Both are criteria that need to be fulfilled for a trusted XrmToolBox plugin.

So far so good. How did I solve it?

ILMerge or ILRepack

I was a big fan of ILMerge, even if there are some drawbacks when you use it. But in my current situation, using it for automated builds, my build failed when I tried to call it with the exec MSBuild task.

I did some research and found ILRepack. This is an open-source replacement for ILMerge, which hasn't been maintained for several years. The good thing is that the exec build task worked fine here. Most probably the same syntax would work with ILMerge too.

I simply had to add the ILRepack NuGet package.


This is how I extended my project file to get the output repacked.

    <Target Name="ILRepack" AfterTargets="Build" Condition="'$(Configuration)' == 'Release'">
        <MakeDir Directories="$(OutputPath)Merged" />
        <ItemGroup>
            <MergeAssemblies Include="$(OutputPath)\Microsoft.ApplicationInsights.dll" />
            <MergeAssemblies Include="$(OutputPath)\Newtonsoft.Json.dll" />
        </ItemGroup>
        <ItemGroup>
            <ILRepackPackage Include="$(NuGetPackageRoot)\ilrepack\*\tools\ilrepack.exe" />
        </ItemGroup>
        <Error Condition="!Exists(@(ILRepackPackage->'%(FullPath)'))" Text="You are trying to use the ILRepack package, but it is not installed or at the correct location" />
        <Exec Command="@(ILRepackPackage->'%(FullPath)') /out:$(OutputPath)Merged\$(AssemblyName).dll /target:library $(OutputPath)$(AssemblyName).dll @(MergeAssemblies->'%(FullPath)', ' ')" />
    </Target>

As you can see, I slightly modified the sample and also added a line to create the "Merged" folder. On a release build the assemblies will now be repacked.

This solved my first issue. However, I would still not pass the criteria for releasing the plugin into the store. Remember the build number?

Assembly Versions

Jonas' blog describes pretty well how to use the build number inside the NuGet package. However, it took me a while to handle the assembly version and the file version.

First of all I had to find the right task in the marketplace. I finally ended up with the Assembly Info task, which does a great job. I simply installed it and could use it in my build definition. It supports a lot of additional parameters.

I added it between the NuGet restore and the Build solution **\*.sln steps.


In the three fields for the version numbers I only needed to put the build number variable, and it worked.


It is also possible to override all other assembly attributes. You can even add attributes that are not yet in your assembly, if you tick "Insert Attributes".



Finally I only had to fine-tune the build number format in the Options. The format suggested by Jonas inserted leading zeros into the file version, which I didn't want. So I changed the definition to:


The result looks the same as from Jonas, but no leading zeros anymore.
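As an illustration (this is not necessarily the exact definition used here), the number of `r` characters in the `$(Rev)` token of a VSTS build number format controls the zero padding of the revision counter:

```
1.0.$(Year:yy)$(DayOfYear)$(Rev:.rr)   -> revision padded to two digits, e.g. .05
1.0.$(Year:yy)$(DayOfYear)$(Rev:.r)    -> revision without padding, e.g. .5
```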

Successful build:


This is the *.nuspec of the release drop.


And this is the decompiled *.dll, using JetBrains dotPeek.


Merge 3rd-party lib into your XrmToolbox Plugin

In some cases you might need to reference additional 3rd-party libraries from your plugin. Keep in mind that you are sharing the plugin folder with all the other publishers. So it is more than possible that someone else references the same library as you, but in a different version. As soon as there is more than one copy of a library with different version numbers, your plugin, and probably the other one too, will fail.

The easiest way around that is to merge the 3rd-party library into your plugin dll. In this example we are using ILMerge. I know it is already pretty old and has some drawbacks, but it works, and I don't want to start a discussion about nicer alternatives.

To proceed just open your Plugin solution and follow the instructions.

Include ILMerge via NuGet

Search for and reference the package "ILMerge.Tools".


Modify the Project-File manually

Navigate to the folder of your project file (*.csproj) and open it in an editor of your choice.

The modification happens after the build. The script only runs on a "Release" build; it takes the normal output file (your plugin dll), merges it together with your 3rd-party library (in this case Microsoft.ApplicationInsights) and puts the result into a "Merged" subfolder of your "Release" folder.

Since the "Merged" folder is not there initially, you also need to make sure that it is created if needed.

First identify the Target of type "AfterBuild". If you don't have one, you need to create it after the last "</PropertyGroup>" before the closing "</Project>".

Add the following block:

<Target Name="AfterBuild" Condition="'$(Configuration)' == 'Release'">
    <ItemGroup>
        <MergeAssemblies Include="$(OutputPath)\{YourPluginName}.dll" />
        <MergeAssemblies Include="$(OutputPath)\Microsoft.ApplicationInsights.dll" />
    </ItemGroup>
    <PropertyGroup>
        <!-- where the merged assembly will be written -->
        <OutputAssembly>$(OutputPath)Merged\{YourPluginName}.dll</OutputAssembly>
    </PropertyGroup>
    <MakeDir Directories="$(OutputPath)Merged" />
    <Message Text="MERGING: @(MergeAssemblies->'%(Filename)') into $(OutputAssembly)" Importance="High" />
    <Exec Command="ilmerge /out:&quot;$(OutputAssembly)&quot; @(MergeAssemblies->'&quot;%(FullPath)&quot;', ' ')" />
</Target>

Replace the placeholder {YourPluginName} with your real plugin name.

Save the project-file and reload it in your Visual Studio.

Build and Cleanup

If your build fails because ILMerge could not be found, make sure you have the "Package Manager Console" open and initialized.

In project-folder\bin\Release\Merged\ you will now find your merged dll, which is slightly bigger.

Finally you have to adjust your nuspec file, since you now want to ship the merged assembly instead of the plain one. So reference the one in the "Merged" folder.
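A minimal sketch of the adjusted files section could look like this ({YourPluginName} and the lib\net462\Plugins target are placeholders; adjust them to your plugin name and target framework):

```xml
<files>
    <!-- ship the repacked assembly instead of the plain build output -->
    <file src="bin\Release\Merged\{YourPluginName}.dll" target="lib\net462\Plugins" />
</files>
```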



If you want to merge more 3rd-party libraries, just add additional lines to the first ItemGroup, like this:

<MergeAssemblies Include="$(OutputPath)\SomeOtherAssemblyINeed.dll" />

That’s it.

Understand Virtual Entities–Part 1

Virtual entities exist since Dynamics 365 version 9.0, and therefore also inside PowerApps. Basically it is more the other way around. When reading the docs.microsoft.com article, it becomes clear what they are used for.

You can show data from external data sources inside your Dynamics 365 without storing the data. Please also respect the limitations.

Virtual entity diagram

Disassemble virtual entities

As we can see in the picture above, virtual entities are separated into multiple parts. You can name them:

1. External data provider

2. External data provider plugin

3. Virtual entity

In the end, virtual entities are also "just" normal entities with some additional attributes.

Everything starts with an External data provider. If you create one, it generates a new entity in Dynamics 365. In this entity you can maintain all the attributes that are needed to create a connection to your external data source. Of course you can use the built-in OData provider. But again, review the limitations.

The External data provider plugin is executed when you consume the data provider with your virtual entity. The plugin can read the configuration information and is registered on Retrieve and RetrieveMultiple. But it is not a normal plugin: you need to take care of translating the query expression coming from Dynamics 365 into the search against your external data service.
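A heavily simplified sketch of such a RetrieveMultiple plugin could look like this (the entity and attribute names new_myvirtualentity / new_name are hypothetical, and the call to the external service is only indicated):

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class ExternalSourceRetrieveMultiplePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // Dynamics 365 hands the query over as a QueryExpression;
        // it has to be translated for the external data source.
        var query = (QueryExpression)context.InputParameters["Query"];

        // ...call the external service here and map each result row...
        var results = new EntityCollection { EntityName = "new_myvirtualentity" };
        var row = new Entity("new_myvirtualentity");
        row.Id = Guid.NewGuid();
        row["new_name"] = "Example row from the external source";
        results.Entities.Add(row);

        // Return the mapped rows to Dynamics 365.
        context.OutputParameters["BusinessEntityCollection"] = results;
    }
}
```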

The last part is the virtual entity itself. It consumes one of your data providers. Inside the virtual entity you have to maintain the external names. An external name is the name of the returned object of your data source. If you write your own adapter, you can do the mapping inside your plugin. To be more generic, you can use the external name fields on the virtual entity and its attributes.

Part 2 will cover how to create your first own data provider using the Plugin Registration Tool and Dynamics 365.

Build versioned XrmToolBox Plugin

The XrmToolBox is one of the greatest community-driven tools for Dynamics 365 CE. It supports several Dynamics CRM versions and works both OnPremise and Online.

There are also several plugins available, written by open-minded people. If something is missing, you can even write and publish your own plugin to the community. This happens in several steps.

Setup new Visual Studio Project

This is already well described in this article: https://www.xrmtoolbox.com/documentation/for-developers/install-xrmtoolbox-plugin-project-template/

To be able to debug your Plugin inside the XrmToolBox, you just need to follow the following steps: https://www.xrmtoolbox.com/documentation/for-developers/debug/

Prepare your Release

Releasing your Plugin happens through NuGet. Here the documentation is lacking a little bit.

First of all you need to add "NuGet.CommandLine" to your project, using the built-in NuGet Package Manager in Visual Studio.


After this you have to generate a nuspec file. The benefit of generating it is that it already contains placeholders. We want these placeholders so that you don't need to maintain the release details in two locations (assembly.cs and yourproject.nuspec).

To generate the file, open the Package Manager Console (View –> Package Manager Console). You might need to navigate into your project folder, where your *.csproj is located.


nuget spec

It will generate a new *.nuspec file with a lot of placeholders. If it fails, close Visual Studio and reopen it; after adding NuGet.CommandLine, sometimes not all paths are set properly.

Fine-tune the *.nuspec file

Open the nuspec file and replace all placeholders that are static with the text you want. I only left the version number as a variable, like this.


Make sure you add the following into the <tags> node:

XrmToolBox Plugin %yourPluginName%

Reference your files

Be aware that you use the Debug builds only for debugging and testing. If you want to release, you should use the Release build. Therefore you have to point to your release output. In my case I only have the plugin dll, so it looks like this. I have to make sure that my output is placed into the Plugins folder on the target. My project targets .NET 4.6.2.


Reference dependencies

Make sure you add the dependency to the XrmToolBox version you were using when you developed your plugin.
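Putting the pieces together, a nuspec along these lines would match the setup described above (all names, versions and paths are illustrative placeholders, not the exact values from this project):

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyCompany.XrmToolBox.MyPlugin</id>
    <version>$version$</version>
    <authors>Your Name</authors>
    <description>My XrmToolBox plugin.</description>
    <tags>XrmToolBox Plugin MyPlugin</tags>
    <dependencies>
      <!-- the XrmToolBox version you developed against -->
      <dependency id="XrmToolBox" version="1.2018.2.1070" />
    </dependencies>
  </metadata>
  <files>
    <!-- Release build output, placed into the Plugins folder (project targets .NET 4.6.2) -->
    <file src="bin\Release\MyPlugin.dll" target="lib\net462\Plugins" />
  </files>
</package>
```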


Finally – Automate the nuget file build on release

We wanted the version number to be taken from the assembly version of our dll. Therefore we have to modify the *.csproj file ourselves. Just open it and make the following modifications.

Search for "<Target Name="AfterBuild">" in your current file. This should be inside a commented section, if you haven't done any manual changes yet.

After the comment add the following section:

   <Target Name="AfterBuild" Condition="'$(Configuration)' == 'Release'">
       <GetAssemblyIdentity AssemblyFiles="$(TargetPath)">
         <Output TaskParameter="Assemblies" ItemName="myAssemblyInfo" />
       </GetAssemblyIdentity>
       <Exec Command="nuget pack .\CRMP.XTBPlugin.SystemComparer.nuspec -Properties version=$([System.Version]::Parse(%(myAssemblyInfo.Version)).ToString(4))" />
   </Target>

This section will now execute after you build your project, in the Release configuration only.

GetAssemblyIdentity retrieves the assembly information of the output, in our case the dll. So we can access the information we entered into our assembly.cs file.

The Exec line creates the nupkg file we need and injects the assembly version from our assembly into the package. So we only have to change the version number once in our project, using the assembly version number.


Virtual Entities–Brief Overview

Virtual entities are a "new" way to integrate external data sources into Dynamics 365. They were introduced in version 9.0. In my opinion, you can find them somewhere between mystery and enlightenment.

Since they are not that easy to handle, I will just summarize how they work and how you can use them.

Official documentation

If you browse the internet, you will quickly find the official documentation on docs.microsoft.com. This gives you a brief understanding of the technology.

A virtual entity is handled like a normal entity, but limited to read operations only. Instead of executing queries against the internal organization database, it uses a data provider to communicate with an external data source.

Out of the box there is only an OData adapter available, which also has strict limitations. But you can write your own data provider.

How does it work

Technically the data provider is based on already existing functionality. In detail, there are two plugins behind the scenes: one running on the Retrieve message and the second on RetrieveMultiple.

If you have a closer look at the entities in Dynamics 365, you will find some new fields when you customize them.


You can mark an entity as a virtual entity on creation; this cannot be changed later. You have to select the data source and define the external names. On attribute level you will also find the external name field. These fields are only editable if the entity is a virtual entity.

They are used to map internal fields (Dynamics 365) to external fields (external source).

As soon as you query the virtual entity, the Dynamics server (respect the IP ranges) will connect to your external data source. This also allows you to see virtual entities on your mobile client.

Architectural idea

In the past it was pretty common to create an interface and synchronize data between two systems. Nowadays this is getting more and more impractical. There are so many services and applications to connect, and users need current and correct data. If everything went through old-school integrations, IT would be fully overloaded with supporting interfaces and making sure the data is in sync on both sides.

Virtual entities are a sweet solution to show data of a remote system in real time. Of course this is read only, but the benefit is that the data is always in sync.


  • Currently only OData is supported without development.
  • Entities are read-only (which is fine for this kind of integration).
  • You cannot use workflows running on Create or Update, because these events are not available. The same is true for rollup and calculated fields.
  • Auditing is also not supported (remember, you are only a consumer).
  • A virtual entity cannot be an activity.
  • There is no support for offline capability, global search, or N:N relationships between virtual entities.
  • Virtual entities are always organization-owned, so the standard security role concept applies. Field security is not available.
  • The recommendation is to only target a single virtual entity in Advanced Find.


Now that we have covered the most important limitations, let's have a look at what is supported:

  • Advanced find is fully supported
  • List and form view
  • SSRS reports
  • Charts
  • Mobile Clients

Summary and personal thoughts

Virtual entities are the right step to integrate external data sources into Dynamics 365 that you only need for read-only purposes. Keep in mind that the Dynamics 365 server will execute the query against the external data source. This means: the higher the latency, the longer the loading time on the user's side.

If you integrate, make sure your service is fast and close to the Dynamics region. If you can, host it in Azure in the same regional data center.

The next article will focus on relationships with virtual entities and custom data providers. In the meantime, integrate Chuck Norris jokes by following the steps from Jason Lattimer.