RM for VS2013 : Increase TFS deployment timeout

Today I saw that the TFS build had triggered the Release Management client, but that the deployment had failed. A closer look told me that TFS decided it had been waiting long enough and marked the deployment as failed. So I went to the Release Management client and tried to increase the timeout for TFS-triggered deployments. But the maximum value allowed in the client is 99 minutes. Yes… 99 minutes. My AX deployment will clearly take longer than that due to the full compile, report deployment, portal deployment, …

To get around this, you can skip the client and dive into the SQL database of Release Management. There is a table called SystemSettings which contains all of the RM settings. To raise the timeout above 99 minutes, just update the DeploymentStatusCheckMaxWaitInMinutes field with the value you need.


UPDATE [ReleaseManagement].[dbo].[SystemSettings]
SET    DeploymentStatusCheckMaxWaitInMinutes = 180
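If you prefer scripting this change, the same update can be run from PowerShell as well. A minimal sketch, assuming the SqlServer (or older SQLPS) module is installed and the ReleaseManagement database lives on the local default SQL Server instance:

```powershell
# Sketch: set the RM deployment timeout from PowerShell instead of SSMS.
# Assumes the SqlServer module is available; adjust -ServerInstance to
# wherever your ReleaseManagement database actually lives.
Import-Module SqlServer

$query = @"
UPDATE [ReleaseManagement].[dbo].[SystemSettings]
SET    DeploymentStatusCheckMaxWaitInMinutes = 180
"@

Invoke-Sqlcmd -ServerInstance "localhost" -Query $query
```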

Reopen the client and there you have it.



Release Management for VS2013 … and AX : The first tool

In part 1 of this series, we discussed the pros and cons of Release Management for VS2013 and scratched the surface of how we are going to use it with Dynamics AX. Since that post has been up for a while, new updates have become available for Release Management (RM from now on), adding some interesting features like agent-less deployment using Desired State Configuration. But anyway, I have been working with RM lately together with my colleague Kevin Roos to automate the deployment of daily TFS build drops to a test Dynamics AX 2012 environment. Today I want to start scratching the surface and show you how to create a first tool within RM to clean the AX artifact folders. It is one of the simplest tools we are going to use, but it shows the following:

  • Creating a PowerShell module that contains generic functions to use with Dynamics AX (the module also takes care of loading other libraries, like the ones from Joris De Gruyter)
  • Creating a PowerShell script to act as a bridge between RM and the PS module
  • Creating a tool and corresponding action to use in RM

So the goal here is to have a tool/action in RM that removes AX artifacts. To achieve this, we use a PowerShell script representing the clean artifacts tool, and that script will in its turn make use of a generic PowerShell module containing a library of functions to be used with Dynamics AX 2012.

Creating the PowerShell module

First things first: before wiring up RM, let’s create a PowerShell module to contain all kinds of functions to be used with AX (cleaning of folders, reading configuration files, starting compilations, …). A good introduction to PowerShell modules can be found in the SimpleTalk article: An Introduction to PowerShell Modules.

So for our module, we created a default module manifest and added the following:

RequiredAssemblies = @('CodeCrib.AX.Config.dll', 'CodeCrib.AX.AXBuild.dll', 'CodeCrib.AX.Client.dll')
ModuleToProcess = @('RDAXManagement.psm1')
NestedModules = @('CodeCrib.AX.Config.PowerShell.dll', 'RDAX.CodeCribWrapper.dll')

The RequiredAssemblies line makes sure that the CodeCrib dll files we use are loaded when we load our module. ModuleToProcess identifies the module this definition applies to. And finally, NestedModules will load the additional cmdlets available in the specified dll files.
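As a side note, such a manifest can also be generated instead of edited by hand. A quick sketch (the file and assembly names match the ones above; New-ModuleManifest fills in sensible defaults for the remaining fields):

```powershell
# Sketch: generate the RDAXManagement manifest with New-ModuleManifest
# instead of editing the .psd1 by hand.
New-ModuleManifest -Path .\RDAXManagement.psd1 `
    -RequiredAssemblies 'CodeCrib.AX.Config.dll', 'CodeCrib.AX.AXBuild.dll', 'CodeCrib.AX.Client.dll' `
    -ModuleToProcess 'RDAXManagement.psm1' `
    -NestedModules 'CodeCrib.AX.Config.PowerShell.dll', 'RDAX.CodeCribWrapper.dll'
```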

Our definition file is ready, so let’s create the module file itself. Create a PowerShell file named RDAXManagement.psm1 and put in the following function:

function Clear-AXArtifactFolders
{
	param(
		[Parameter(	Position    = 0,
				Mandatory   = $true)]
		[string]$clientConfigName,
		[Parameter(	Position   = 1,
				Mandatory  = $true)]
		[string]$serverConfigName
	)
	# Read the server configuration file
	$serverConfiguration = Get-ServerConfiguration -filename $serverConfigName
	$serverAltBinDir = $serverConfiguration.AlternateBinDirectory
	$LocalAppDataFolder = [Environment]::GetFolderPath('LocalApplicationData')

	# Server side: label artifacts
	$FolderPath = [string]::Format("{0}Application\Appl\Standard\*", $serverAltBinDir)
	Write-Host "Cleaning server label artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Include ax*.al? -Recurse
	# Server side: XppIL artifacts
	$FolderPath = [string]::Format("{0}XppIL\*", $serverAltBinDir)
	Write-Host "Cleaning server XppIL artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Include *.* -Recurse
	# Server side: VSAssemblies
	$FolderPath = [string]::Format("{0}VSAssemblies\*", $serverAltBinDir)
	Write-Host "Cleaning server VSAssemblies artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Include *.* -Recurse
	# Client side: auc/kti cache files
	$FolderPath = [string]::Format("{0}\*", $LocalAppDataFolder)
	Write-Host "Cleaning client cache artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Include *.auc, *.kti
	# Client side: VSAssemblies
	$FolderPath = [string]::Format("{0}\Microsoft\Dynamics Ax\*", $LocalAppDataFolder)
	Write-Host "Cleaning client VSAssemblies artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Filter "VSAssemblies"
}

So there we have the first function of our generic library available. It’s time to create a wrapper script that we can attach to a tool within the RM client.

Creating the PowerShell RM tool wrapper

There is a good reason why we use wrapper scripts linked to RM tools. By creating several tools, the parameter list passed to a specific tool stays smaller than it would be with one big script that has to deal with all of the possible action/parameter combinations to feed the RDAXManagement PowerShell module. And trust me, some of the tools coming up later already have a bunch of parameter sets to deal with.

But anyway, create a PowerShell script called RDAXFolderCleaner.ps1 and put in the following code:

<#
.SYNOPSIS
	Cleans up the AX artifacts
.DESCRIPTION
	This script will delete cache files, XppIL files, label caches, ... from the AX folders
.PARAMETER AxClientConfigName
	The client configuration to load client side folder metadata
.PARAMETER AxServerConfigName
	The server configuration to load server side folder metadata
#>
param(
    [string]$AxClientConfigName = $(throw "The client configuration file name must be provided."),
    [string]$AxServerConfigName = $(throw "The server configuration file name must be provided.")
)

# This makes sure that Release Management also fails the step instead of continuing execution
$ErrorActionPreference = "Stop"
$ExitCode = 0

Try
{
	# Determine the working directory of the deploy agent
	$InvocationDir 		= Split-Path $MyInvocation.MyCommand.Path
	# Load the Dynamics AX Management PowerShell module
	$AXModuleFileName 	= [string]::Format("{0}\RDAXManagement.psd1", $InvocationDir)
	Import-Module -Name $AXModuleFileName
	# Call the cleanup routine
	Clear-AXArtifactFolders -clientConfigName $AxClientConfigName -serverConfigName $AxServerConfigName
	"`nCleaning of artifacts finished."
}
Catch [system.exception]
{
	$ErrorMessage = $_.Exception.Message
	"Error : RM Cleaning of folder failed. Exception message: $ErrorMessage"
	$ExitCode = 1
}

exit $ExitCode

So to summarize the above code: the script takes the needed parameters from RM for this specific action and passes them on to the generic AX PowerShell module. Apart from the business logic, the most important line is the one telling PowerShell to stop when an error is encountered. If you do not put this line in, PowerShell will show errors in the log, but RM will just keep on thinking the steps in the release sequence finished successfully.

Creating the RM tool and corresponding action

Now that we have the business logic in the PowerShell module and a script that acts as a bridge between RM and the module, let’s create a tool and action for it in RM itself. Tools are used to group several actions that can be done with one and the same tool. In RM, go to the tab page Inventory, select Tools and create a new tool. Then you can setup the tool as in the picture below.


As you can see, the tool will invoke PowerShell and pass the folder cleaner tool script. Parameters can be specified by using double underscores ‘__’ before and after the parameter name. By doing this, the parameters become available in the workflow shapes later on.
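To make that concrete, the command configured on the tool could look something like this (a sketch — the parameter tokens are illustrative, and RM substitutes the __token__ placeholders with the values configured on the workflow shape):

```powershell
# Hypothetical tool command; RM replaces the __...__ tokens at deploy time.
powershell -command ./RDAXFolderCleaner.ps1 -AxClientConfigName '__AxClientConfigName__' -AxServerConfigName '__AxServerConfigName__'
```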

Now we’re almost there; the only thing remaining is adding the needed resources for the tool. When the Deployment Agent runs the tool, it will need the PowerShell script, our module and the related dll files. By adding these references in the bottom part of the tool screen, they get deployed to the Deployment Agent when the tool runs. This is actually also the easiest way to get a look at the standard scripts shipped with RM, since you cannot save a local copy of those from the RM client.


So there we have our first tool. The only thing left to do in this already growing blog post is creating an action we can use within our deployment sequences. Next to the Tools section, you can find the Actions tab. Now create a new action as seen below. Notice that when you select the tool this action uses, the command is copied from the tool and can be modified for this action. This is convenient when using the same tool for multiple actions that require different parameter sets.


With all of this in place, you can start using this tool in deployment sequences.


In the following blog posts, we will discuss the Release Paths / Environments and get closer to the actual Release Templates. And in between, Tools and Actions for dealing with custom Dynamics AX stuff will pop up as we go.

Release Management for VS2013 … and AX : Part 1

In a previous post I mentioned Release Management for Visual Studio 2013, but I didn’t take the time to go through its capabilities and why I was using it. We have spent some time on creating automated builds for AX, and these have been up and running for a while now using the code activities from Joris De Gruyter. Right now I am working together with my colleague Kevin Roos to investigate automated deployments with AX 2012 R3. Along the road, I hope to find the time to write some posts that provide you with some insights. This post is the first of them.

So the first thing we thought of was deciding which toolset we would be using. We didn’t want to reinvent the wheel, and yet most of the things available have their limitations. So a few brainstorm sessions later, we decided to go for the following set of tools:

Release Management for Visual Studio 2013

Formerly known as InRelease, Microsoft bought this solution from InCycle, incorporated it into the Visual Studio 2013 stack and named it Release Management for Visual Studio 2013. RM helps you deploy your solutions across the different stages in their lifecycle.

The pros are:

  • The tool enables the definition of environments, servers, environment stages to work on
  • A workflow based way of deploying between environment stages
  • The ability to create custom tools to be used within deployments
  • Template designer to easily drag and drop servers, actions, … in your release templates.
  • Tight integration with TFS Builds (Deploy build outputs, trigger deployments from successful builds, …)

The cons:

  • The first and by far the biggest downside of RM is that the tools and actions used within release templates do not use the same type of activities as the TFS build CodeActivity types.
    It would be nice if Microsoft added the possibility to use code activities while they are working on the product now. Then it would be possible to create code activities that can be reused for both TFS Builds and RM Releases. (And we could again make use of the code activities Joris created.)
  • The second con might only be clear to people already using RM. The RM client could use a more user-friendly way of working your way around the different sections. I know this is not a big deal, but let me give an example. When deleting an action, you need to go through all of the templates manually to remove it before you can actually remove the action/tool. A reference tool or cascading remove would be nice :)

DynamicsAxAdmin sources (from Joris’ CodePlex)

Joris De Gruyter has created a set of code activities that can be used as TFS Build Activities, and along with those activities he has a whole set of tools and plumbing code that can be reused. The most important thing here for us are the PowerShell cmdlets that are also available in his solution on CodePlex. A good example of a useful cmdlet is Get-ClientConfiguration, which will try to find the client configuration via registry, file, … and get all of the details like binary folders, installation directory, and so on. So along the way, we will try to reuse some of those cmdlets.

Dynamics Ax 2012 Management utilities

The management utilities will be used to reuse the cmdlets from Microsoft to get stuff done in the model store, deploy reports, deploy the portal, …

PowerShell module and ‘wrapper’ scripts

To do most of our own plumbing, PowerShell will be used. The idea is to create a PowerShell module that contains reusable functions to work with Dynamics AX. This module will be used by the various tools and actions that we will use within RM. Next to the module, we will create a few tools within RM, and each of these tools will have a ‘wrapper’ PowerShell script that handles the different actions that can be performed with that tool and the parameters used for those actions. The reason for splitting into different tools is easier management from within RM and cleaner PowerShell files with separation of concerns.

So all of the above results in the following overview.


Let’s take a look at the cleaning of AX artifact folders to understand a bit more of what is going on there. The idea is to create a tool that can be used by several actions. For the cleaning of artifact folders, we only have a 1-1 relationship between actions and tools. The AX Folder cleaner tool actually runs PowerShell and uses attached resources (PowerShell files) to do the work. All of the actions that use the tool can use these same resources, but with a different command line to run them. More technical details on what is going on exactly will follow in the next post.

Failed to create a session revisited

Today I was experiencing this error again when working a new Hyper-V with imported demo data in AX 2012 R3:

Failed to create a session; confirm that the user has the proper privileges to log on to Microsoft Dynamics

In a previous post I described a solution for this, but today the script in that post didn’t cut it. The reason is that there are more partitions involved now. So I wanted to tell you a bit more about how to solve it.

The cause of the issue remains in the UserInfo table. But to show you, first start up a client in the Initial partition and open the partition form.


When we add a partition here, some things happen in the background. The AOS will also create new user entries in the UserInfo table for the ‘Admin’ and ‘Guest’ users. When you modify the client configuration to log on to the new partition, it will work and you can run the partition initialization checklist.



But when importing demo data using the Data Transfer Tool, partitions are imported, yet the background procedure that adds the Admin and Guest users to the UserInfo table is not executed. So the real problem is that the UserInfo table does not contain the Admin and Guest records for each of the partitions. You need to make sure that for every partition available there is a matching pair of records in the UserInfo table. (In my case I created the records and played with the RecIds, but be careful with that.)
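To quickly spot which partitions are missing their records, you can query the business database. A sketch, assuming an AX 2012 R3 schema and a database named MicrosoftDynamicsAX (the table and column names are assumptions; adjust them for your environment):

```powershell
# Sketch: list partitions that have no 'Admin' record in USERINFO.
# Database/column names are assumptions based on an AX 2012 R3 schema.
$query = @"
SELECT P.PARTITIONKEY
FROM   [dbo].[PARTITIONS] P
WHERE  NOT EXISTS (SELECT 1 FROM [dbo].[USERINFO] U
                   WHERE U.[PARTITION] = P.RECID AND U.ID = 'Admin')
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Database "MicrosoftDynamicsAX" -Query $query
```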

ListPageInteraction lookup

Small post and to the point :)

I wanted to point out a small detail that I had forgotten today in AX 2012. When you want to add custom logic to a ListPage form, you might want to write your own custom ListPageInteraction class. So what you need to do is just create a class that extends the ListPageInteraction class, and then you can select it in the combo box on the corresponding form, right? Well, the answer is no!

There’s a small detail here that might be missed. The kernel does not only look for classes that extend ListPageInteraction; your class name also needs to end with ‘ListPageInteraction’. So when you add a class and name it so long that you have to shorten it (MyTooLongClassNameForAListPInteract), it won’t show up in the list! Rename it to MyListPageInteraction and it shows up in the dialog!

Windows Azure : Copy VHD between subscriptions

Last week I was playing around with LifeCycle Services and tested out the ‘Cloud hosted environments’ feature. This is an awesome feature that will save many of you some hours in prepping demo virtual machines. (For now, it is only able to deploy a template environment suited for demo purposes, but more templates will be available later on)

But it didn’t take long until I discovered my first ‘missing feature’… I was stupid enough to link the wrong subscription to the LCS project, and I ended up adding my pay-as-you-go subscription instead of the one with actual credit on it :| Soon after that I realized there is no option to unlink it again. So there you have it: the LCS project and one of my subscriptions are married to one another now!

So I had two options here:

  • Delete the VM in Azure and redeploy a new one from LCS
  • Copy the VM from one subscription to another.

The first one was the easy one, so to have a bit of a challenge I went for the second option. Let’s write some PowerShell to get things copied from one subscription to another! Here is how you do just that:

Install the Azure Powershell module

You can download and install the Azure PowerShell module by running the Microsoft Web Platform Installer. When finished, you can fire it up in Windows 8 by just typing “Power” at the Start screen and you will see it right away :).

Get your machine familiar with your Azure subscriptions

To be able to run the PowerShell commands in the next part, you will first need to import your subscription settings on your machine. Your first command here will be:

Get-AzurePublishSettingsFile

When running this cmdlet, your browser will pop up and take you to the correct link to download the publish settings file. When you have downloaded the file, run the following cmdlet to import it on your machine:

Import-AzurePublishSettingsFile "Path\subscriptionname.publishsettings"
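To verify the import worked, you can list the subscriptions that are now known on the machine (a quick sketch):

```powershell
# List the subscriptions known on this machine after the import
Get-AzureSubscription | Select-Object SubscriptionName
```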

Copy the VHD disks between subscriptions

Now that we are all set up, it’s time to start copying the virtual disks between the source and target subscription. Use the following script to do so.

### This assumes you have already imported the Azure publish settings file and the subscriptions are known on your machine.
Select-AzureSubscription "MyCoolSubscriptionName"
### Defines the location of the VHD we want to copy
$sourceVhdUri = "https://mysourcestorageaccountname.blob.core.windows.net/omitted_part_of_the_uri/AX2012R3_DEMO_OS.vhd"
### Defines the source Storage Account
$sourceStorageAccountName = "mysourcestorageaccountname"
$sourceStorageKey = "mywaytolongsourcestoragekey"
### Defines the target Storage Account
$destStorageAccountName = "mytargetstorageaccountname"
$destStorageKey = "myalsowaytolongtargetstoragekey"
### Create a context for the source storage account
$sourceContext = New-AzureStorageContext -StorageAccountName $sourceStorageAccountName `
                                         -StorageAccountKey $sourceStorageKey
### Create a context for the target storage account
$destContext = New-AzureStorageContext -StorageAccountName $destStorageAccountName `
                                       -StorageAccountKey $destStorageKey
### Name for the destination storage container (container names must be lowercase)
$containerName = "dynamicsdeployments"
### Create a new container in the destination storage
New-AzureStorageContainer -Name $containerName -Context $destContext
### Start the async copy operation
$blob1 = Start-AzureStorageBlobCopy -srcUri $sourceVhdUri `
                                    -SrcContext $sourceContext `
                                    -DestContainer $containerName `
                                    -DestBlob "AX2012R3_DEMO_OS.vhd" `
                                    -DestContext $destContext
### Get the status of the copy operation
$status = $blob1 | Get-AzureStorageBlobCopyState
### Output the status every 10 seconds until it is finished
While($status.Status -eq "Pending"){
  $status = $blob1 | Get-AzureStorageBlobCopyState
  Start-Sleep 10
  $status
}
If you filled in the specifics of your own subscriptions, storage accounts, … you should see the following output while copying.

Copy the virtual machine between subscriptions

Now that we have transferred the virtual disks between subscriptions, we still have to copy the virtual machine itself. To do that, there is a nice script available in the TechNet Script Center: Copy a Virtual Machine between Subscriptions.

So there you have it, your VM is now copied to a second subscription and if you like, you can just remove it from the source subscription.


Release Management – Customize timeouts on default tools / actions

In Release Management for Visual Studio 2013, you can customize the tools and actions you work with. The last few days I was creating a release template for use with Dynamics AX 2012. One of the actions early in the template is backing up the database. (Of course you need to back up first :-) ) And there I faced a little inconvenience: I could not adjust the timeout of the restore action, but this was needed as an AX model store database restore could not be completed within 5 minutes.

So I decided to go back to the Restore SQL Database action and adjust the timeout setting. But hey, you are unable to do that on the default tool!


There is a parameter to set up default timeouts, but this only applies when creating new actions. So that doesn’t help a lot.


So for me, the most convenient and simple solution was to copy the tool that is available out of the box and create a new one from it. Then you do have the ability to set the timeout.




The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.

Today I experienced this problem while importing models from the build machine into one of the developer machines:

The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.

A quick search on the internet took me to this blog post, which explains the importance of initializing/updating your model store after installing CUs. To initialize or update your model store, you simply have to run AxUtil with the schema command. (The PowerShell alternative is Initialize-AXModelStore.)
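For reference, a minimal sketch of both variants; run the PowerShell one from the Microsoft Dynamics AX 2012 Management Shell, where the defaults from the active AX configuration are picked up:

```powershell
# Sketch: re-initialize the model store schema after installing a CU.
# Uses the default server/database from the active AX configuration.
Initialize-AXModelStore

# The AxUtil equivalent (from a regular command prompt):
# axutil schema
```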

For a full list of AXUtil and PowerShell commands, please refer to MSDN.

Announcing Microsoft Dynamics AX 2012 R2 Services

Hello everybody!

I’m very happy to announce Microsoft Dynamics AX 2012 R2 Services, published by Packt Publishing and authored by Klaas and myself. It is the successor of Microsoft Dynamics AX 2012 Services, which was very well received. Based on your feedback and on the new R2 version, we started working on an update in September 2013. We have gone through all existing content, updated and improved it, and added a lot of new content. We are really happy with the result and are excited to hear your reactions.

We are working hard on the book at the moment, the expected release date is May 2014 but you can already pre-order it.

There are also a couple of giveaways that can be won! More information can be found in the full blog post created by Klaas.



The Registry.CurrentUser impediment

Today I was working on creating a build process template for AX 2012. I will not go into the details of the build process template and the custom activities, as these will be covered by Joris De Gruyter later on. But still, I wanted to show you a nasty little problem I was experiencing yesterday.

The build process threw an exception: “Object reference not set to an instance of an object”.

So, this was of course a null reference, but after some investigation ( by using primitive logging because this was running in the agent process :-) ) it turned out to be located in this code:

public static string GetInstallDir32()
{
    // regKey holds the registry key path and is defined elsewhere in the class
    RegistryKey componentsKey = Registry.CurrentUser.OpenSubKey(regKey, false);
    String installDir32 = componentsKey.GetValue("InstallDir32", null) as String;
    return System.Environment.ExpandEnvironmentVariables(installDir32);
}

It turned out to be the Registry.CurrentUser property that returned null. And then it struck me: the agent ran under a service account, s_tfs_build, and since this account had never been used to log on interactively, the profile of this user had never been loaded up to that point.

The simple solution was to log on to the system with the service account so that the user profile and registry hive were loaded. From then on, the Registry.CurrentUser property was properly filled.