CombineXPOs.exe : Length cannot be less than zero

When I was using the CombineXPOs.exe executable the other day on a new project, I ran into an issue. The error message thrown by the tool was the following:


CombineXPOs.exe : Warning [-5] CombineXPOs.exe : Error [<pathToXpo>] Length cannot be less than zero.


I remembered that Martin Dráb ran into the same error previously so I went to take a look at his blog post on this type of error. In his post, he explains this might be because of empty / corrupted XPO files and indeed, this error pops up when you either have an empty XPO file or a (corrupted) shared project node that contains no objects.

Removing 2 of 6000+ XPO files that turned out to be empty indeed helped and the tool got further in the list of XPO files. But wait, it still gave the same error message?! So what was different here? I saw that the error code being returned was not -4, but -5. And I did not have empty XPO files anymore. And on top of that, now it was the directory with all the Shared Project XPO files that returned errors.
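Hunting down the empty files by hand in a code base of 6000+ XPOs is no fun. A quick script can flag the zero-byte ones up front; here is a sketch in Python (illustrative only — CombineXPOs.exe itself is a .NET tool, and the function name is mine):

```python
from pathlib import Path

def find_empty_xpos(root):
    """Return all zero-byte .xpo files under root, case-insensitive on the extension."""
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and p.suffix.lower() == ".xpo" and p.stat().st_size == 0
    )
```

Run something like this against the XPO drop folder before calling CombineXPOs.exe and re-export (or delete) whatever it reports.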
I couldn’t see right away what the issue was here, so I decided to peek into the executable and reverse engineer a C# project out of it. Then I stepped through the process of combining a single file and managed to see what was going on. Somewhere in there, an XPOReader class opens the XPO files and looks at the encoding of the files.

if (fileStream.Length > 3L && (int)numArray[0] == 239 && ((int)numArray[1] == 187 && (int)numArray[2] == 191))
{
    // UTF-8 BOM (EF BB BF) found
    this.fileEncoding = (Encoding)this.utf8Encoding;
    num1 = 1;
}
else
{
    // No BOM recognized: assume CP1252 (Latin 1)
    this.fileEncoding = Encoding.GetEncoding(1252);
}

Looking at the code, it turns out that it looks for UTF-8 files and, if the BOM is not found, assumes the file must be CP1252 (Latin 1). But what if my files have another encoding? Looking at the files, they turned out to be UCS-2 LE BOM. I haven’t yet looked into why these project XPO files have a different encoding. To solve the issue in this case, a bit of additional code makes sure the XPO reader can correctly read the XPO through codepage 1200.


if (fileStream.Length > 3L && (int)numArray[0] == 239 && ((int)numArray[1] == 187 && (int)numArray[2] == 191))
{
    this.fileEncoding = (Encoding)this.utf8Encoding;
    num1 = 1;
}
// KeSae : Check for codepage UCS-2 LE (BOM FF FE)
else if (numArray[0] == 0xFF && numArray[1] == 0xFE)
{
    this.fileEncoding = Encoding.GetEncoding(1200);
}
else
{
    this.fileEncoding = Encoding.GetEncoding(1252);
}

Rebuilding the project resulted in a customized CombineXPOs.exe executable that correctly handles the shared project nodes.
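For reference, the whole fix boils down to sniffing the first bytes of each file for a byte order mark. The same decision logic, sketched in Python for clarity (the function name and return values are mine, not from the decompiled source):

```python
def sniff_xpo_encoding(first_bytes):
    """Decide the encoding the way the patched XPOReader does:
    UTF-8 BOM first, then UTF-16 LE BOM, otherwise fall back to CP1252."""
    if first_bytes[:3] == b"\xEF\xBB\xBF":
        return "utf-8-sig"   # EF BB BF: UTF-8 with BOM
    if first_bytes[:2] == b"\xFF\xFE":
        return "utf-16-le"   # FF FE: UCS-2 / UTF-16 little endian (codepage 1200)
    return "cp1252"          # no BOM recognized: Latin 1, as the original tool assumed
```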

How we do AX releases – Part 1

“Good release management practices ensure that when your software is built, it will be successfully deployed to the people who want to use it. You have the opportunity to satisfy existing customers and hopefully to win new ones.” That statement comes from an excellent article discussing 7 ways to improve your releases. Before continuing here, make sure you read the article entirely. It really deals with all of the facets of releases, from planning to infrastructure and automation of the process.


Let’s elaborate on a few of the topics in the article and explain how this series will provide a way of doing this for Microsoft Dynamics AX releases (and remember: it will be A way to do it, not THE way):


  • Plan your releases!
    Captain Obvious to the rescue... But let’s be honest, how many of you technical people are often forced into a situation where an ad hoc release is asked for? So it is of the utmost importance to establish a regular release cycle. That way, all of the stakeholders have a clear view on the timetable and on when to expect to see progress. Your release cycle is not about when your customer wants the release; it’s about when you can deliver it to the desired level of quality. One thing we have done to support planning releases and following up the time spent on them is creating a custom work item type ‘Release’. On that work item type, we can put all the details concerning the release (title, original estimate, actuals, linked queries to be able to query for the work items related to this release).


  • Establish your release infrastructure early
    For Dynamics projects, this is even more important, especially because a full Microsoft Dynamics AX environment requires quite some resources. This is one of the most difficult topics to discuss when it comes to figures. Having all of the needed environments (DEV, QA, UAT, STAGING, MIGRATION, PRODUCTION) makes your offer more expensive compared to other vendors stating that DEV, QA and PRD are enough. But in the end, the customer will notice during the implementation that he does need those environments to do things as they should be done.


  • Release notes are not optional!
    And yet I still come across projects where releases are done without informing the customer of what has actually been changed. And it’s all about perception here. You do not want all of the bugs on the notes, but the ones reported by the customer do need to be mentioned. That way, customers have a notion of progress and key users can be informed about the changes. To achieve this, we have done two things. First, we added a field Release note (No / Yes) to the work item types available on release notes. This way we can filter out the things we do not want on the notes. Secondly, we created a Visual Studio Team Explorer add-in that enables us to generate release notes based on a selected work item of the type Release. Through the use of the OpenXML SDK, we generate a Word document containing a couple of sections, each section linked to a work item query.


In the next post of this series, we will take a closer look at the custom work item type ‘Release’.

RM for VS2013 : Creating the deployment sequence

This is the third post in a series discussing automated releases for Dynamics AX 2012 with Release Management for Visual Studio 2013. Before, we discussed the following topics:

  • Part 1 discussed the pros and cons of Release Management and gave a glimpse of how it would fit in Dynamics AX.
  • In part 2, a first tool was created. That post gave an overview of the central PowerShell module and explained the use of tool wrappers.

In this post, I wanted to tell you all about setting up servers, environments, paths and so on, but this is already well documented in the user’s guide. Also, for AX this does not differ from other products. (Sorry for the plain links in here, but WordPress is bugging me with some issues in the editor :))


Instead, let’s focus on the AX specific parts. More specifically:

  • Component configuration
  • Release Template configuration
  • Deployment sequence

Component configuration

Components can be seen as logical parts of a solution. This can be a database, a website, … These components can be configured in RM and you can assign deployment tools to them, link them to sources, … For Dynamics AX, we state that the component comprises the set of models or the model store in the build output.

So for us, the configuration of an ‘AX Component’ is pretty straightforward. We create a component “AX2012Solution” and there are two things to configure: where are your component sources, and how do we deploy these sources to the agents?


First, we define where the component sources are located. There are 3 options here:

  • Builds with application
    This links the component to a TFS Build definition defined in the release template. By doing so, RM knows what build drops to take and use these as the source for your deployment. So different release templates can be attached to different definitions. We use this when branching environments.
  • Builds independently
    In short, this is the same as the first option, but the build definition is linked to the component and not to the release template.
  • Builds externally
    Use this when you have no access to TFS from your deployment machines. You are able to have ‘offline’ sources available in a folder.

In our case, this is linked to TFS through the release template.


Next up is the Deployment tab. Here we’ll tell RM how to deploy the selected resources to the deployment agents (the machines involved in your deployment). For AX, we can use the XCopy deployer here. Why? Because we can use it to simply copy the sources from the build output folder to a known variable (__installationPath__) within RM. From then on, we can use that variable in our release template.


Release Template configuration

Next, you need to create a release template. This is where all of the pieces fall together. Start off by selecting the properties of your template. There you can specify the release path to be used. Each stage in your path will result in a tab on your template, so it is possible to have a different deployment sequence for the different stages.

The second important thing here is the connection to the TFS Build definition. When you specify a build definition, RM will be able to present a list of recent builds when starting a new release.


Deployment sequence

And now, the magic of the whole thing! Putting together a sequence of actions that define your release. In the sequence editor you can see the following:

  • At the left side, the toolbar containing your servers, components and inventory (tooling)
  • Top right, you can see a tab for each of the stages in the release path (to be able to have a deployment template per stage)
  • In the main area, you can drag and drop your actions onto the canvas. Depending on the action, you can input the parameter values needed.

So you can start off by dragging actions and composing your deployment. There are a few rules here to keep in mind.

  • Actions run on a certain server, so you have to start by dragging in the machine you want to use first.
  • Your component is also available in the toolbox, so you can use your component shape to copy the build sources to the corresponding server.

Below, you can see a part of the deployment sequence and how it all falls together. The Install AX model action will collect parameters and pass them on to the PowerShell wrapper tool. From there on, the central PowerShell module is called and parameters are passed.



There you go! I hope this has shed some light on how to configure your deployment sequence for AX. In the next post, we will take a look at starting a release and how we can follow up on deployments. After that, we should also take a look at the TFS Build template to trigger RM releases from within TFS Builds.


RM for VS2013 : Increase TFS deployment timeout

Today I saw that the TFS build had triggered the Release Management client but that the deployment had failed. A closer look told me that TFS thought it had been waiting long enough and decided that the deployment failed. So I went to the Release Management client and tried to increase the timeout for TFS-triggered deployments. But the maximum value allowed in the client is 99 minutes. Yes… 99 minutes, so clearly my AX deployment will take longer due to the full compile, report deployment, portal deployment, …

To get around this, you can skip the client and dive into the SQL database of Release Management. There is a table called SystemSettings which contains all of the RM settings. To get the timeout above 99 minutes, just update the DeploymentStatusCheckMaxWaitInMinutes field with the value you need.


UPDATE [ReleaseManagement].[dbo].[SystemSettings]
SET    DeploymentStatusCheckMaxWaitInMinutes = 180

Reopen the client and there you have it.



Release Management for VS2013 … and AX : The first tool

In part 1 of this series, we discussed the pros and cons of Release Management for VS2013 and scratched the surface of how we are going to use it with Dynamics AX. As that post has been out for a while, new updates have become available for Release Management (RM from now on), adding some interesting features like agent-less deployment using Desired State Configuration. But anyway, I have been working with RM lately together with my colleague Kevin Roos to automate the deployment of daily TFS build drops to a test Dynamics AX 2012 environment. Today I want to show you how to create a first tool within RM to clean the AX artifact folders. It is one of the simplest tools that we are going to use, but it shows the following:

  • Creating a PowerShell module that contains generic functions to use with Dynamics AX (the module also takes care of loading other libraries, like the ones from Joris De Gruyter)
  • Creating a PowerShell script to act as a bridge between RM and the PS module
  • Creating a tool and corresponding action to use in RM

So the goal here is to have a tool / action in RM that removes AX artifacts. To achieve this, we use a PowerShell script representing the clean artifacts tool, and that script will in turn make use of a generic PowerShell module that contains a library of functions to be used with Dynamics AX 2012.

Creating the PowerShell module

First things first: before wiring up RM, let’s create a PowerShell module to contain all kinds of functions to be used with AX (cleaning of folders, reading configuration files, starting compilations, …). A good introduction to PowerShell modules can be found in the SimpleTalk blog article An Introduction to PowerShell modules.

So for our module here, we created a default module manifest and added the following entries:

RequiredAssemblies = @('CodeCrib.AX.Config.dll', 'CodeCrib.AX.AXBuild.dll', 'CodeCrib.AX.Client.dll')
ModuleToProcess = @('RDAXManagement.psm1')
NestedModules = @('CodeCrib.AX.Config.PowerShell.dll', 'RDAX.CodeCribWrapper.dll')

The RequiredAssemblies line makes sure that the CodeCrib dll files we use are loaded when we load our module. ModuleToProcess points to the module this manifest applies to. And finally, NestedModules will load the additional Cmdlets available in the specified dll files.

So, our definition file is ready; let’s create the module file itself. Create a PowerShell file named RDAXManagement.psm1 and put in the following function:

function Clear-AXArtifactFolders
{
	param
	(
		[Parameter(Position = 0, Mandatory = $true)]
		[string]$clientConfigName,
		[Parameter(Position = 1, Mandatory = $true)]
		[string]$serverConfigName
	)

	# Read the server configuration file
	$serverConfiguration = Get-ServerConfiguration -filename $serverConfigName
	$serverAltBinDir = $serverConfiguration.AlternateBinDirectory

	$LocalAppDataFolder = [Environment]::GetFolderPath('LocalApplicationData')

	$FolderPath = [string]::Format("{0}Application\Appl\Standard\*", $serverAltBinDir)
	Write-Host "Cleaning server label artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Include ax*.al? -Recurse

	$FolderPath = [string]::Format("{0}XppIL\*", $serverAltBinDir)
	Write-Host "Cleaning server XppIL artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Include *.* -Recurse

	$FolderPath = [string]::Format("{0}VSAssemblies\*", $serverAltBinDir)
	Write-Host "Cleaning server VSAssemblies artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Include *.* -Recurse

	$FolderPath = [string]::Format("{0}\*", $LocalAppDataFolder)
	Write-Host "Cleaning client cache artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Include *.auc, *.kti

	$FolderPath = [string]::Format("{0}\Microsoft\Dynamics Ax\*", $LocalAppDataFolder)
	Write-Host "Cleaning client VSAssemblies artifacts ($FolderPath)"
	Remove-Item -Path $FolderPath -Filter "VSAssemblies" -Recurse
}

So there we have the first function of our generic library. It’s time to create a wrapper script that we can attach to a tool within the RM client.

Creating the PowerShell RM tool wrapper

There is a good reason why we use wrapper scripts linked to RM tools. By creating several tools, the parameter list passed to a specific tool stays smaller than with one big script that has to deal with all of the possible action/parameter combinations to feed the RDAXManagement PowerShell module. And trust me, some of the tools coming up later already have a bunch of parameter sets to deal with.

But anyway, create a PowerShell script called RDAXFolderCleaner.ps1 and put in the following code:

<#
.SYNOPSIS
	Cleans up the AX artifacts
.DESCRIPTION
	This script will delete cache files, XppIL files, label caches, ... from the AX folders
.PARAMETER AxClientConfigName
	The client configuration to load client side folder metadata
.PARAMETER AxServerConfigName
	The server configuration to load server side folder metadata
#>
param
(
    [string]$AxClientConfigName = $(throw "The client configuration file name must be provided."),
    [string]$AxServerConfigName = $(throw "The server configuration file name must be provided.")
)

# This makes sure that Release Management also fails the step instead of continuing execution
$ErrorActionPreference = "Stop"

$ExitCode = 0

Try
{
	# Determine the working directory of the deploy agent
	$InvocationDir = Split-Path $MyInvocation.MyCommand.Path

	# Load the Dynamics AX Management PowerShell module
	$AXModuleFileName = [string]::Format("{0}\RDAXManagement.psd1", $InvocationDir)
	Import-Module -Name $AXModuleFileName

	# Call the cleanup routine
	Clear-AXArtifactFolders -clientConfigName $AxClientConfigName -serverConfigName $AxServerConfigName

	"`nCleaning of artifacts finished."
}
Catch [system.exception]
{
	$ErrorMessage = $_.Exception.Message
	"Error : RM Cleaning of folder failed. Exception message: $ErrorMessage"
	$ExitCode = 1
}

exit $ExitCode

So to summarize the above code: the script takes the needed parameters from RM for this specific action and passes them to the AX generic PowerShell module. Apart from the business logic, the most important line is the one telling PowerShell to stop when an error is encountered. If you do not put this line in, PowerShell will pop up errors in the log, but RM will just keep on thinking the steps in the release sequence finished successfully.

Creating the RM tool and corresponding action

Now that we have the business logic in the PowerShell module and a script that acts as a bridge between RM and the module, let’s create a tool and an action for it in RM itself. Tools are used to group several actions that can be performed with one and the same tool. In RM, go to the tab page Inventory, select Tools and create a new tool. Then you can set up the tool as in the picture below.


As you can see, the tool will invoke PowerShell and pass it the folder cleaner tool script. Parameters can be specified by using double underscores ‘__’ before and after the parameter name. By doing this, the parameters become available in the workflow shapes later on.
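Conceptually, the token substitution works like the sketch below (Python, purely illustrative; RM’s real implementation is not public and the function name is mine):

```python
import re

def expand_rm_params(command, params):
    """Replace __Name__ tokens in a tool command with values supplied by the
    release workflow. Tokens without a matching value are left untouched."""
    return re.sub(
        r"__(\w+)__",
        lambda m: str(params.get(m.group(1), m.group(0))),
        command,
    )
```

So a command line containing `__AxClientConfigName__` gets that token swapped for whatever value the workflow shape provides at deployment time.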

Now we’re almost there; the only thing remaining is adding the needed resources for the tool. When the Deployment Agent needs to run the tool, it will need the PowerShell script, our module and the related dll files that are used. By adding these references in the bottom part of the tool screen, they get deployed on the Deployment Agent when the tool runs. This is actually also the easiest way to get hold of the PowerShell scripts behind the standard tools in RM, since you cannot save a local copy of those from the RM client.


So there we have our first tool. The only thing left to do in this already growing blog post is creating an action we can use within our deployment sequences. Next to the Tools section, you can find the Actions tab. Now create a new action as seen below. Notice that when you select the tool this action uses, the command is copied from the tool and can be modified for this action. This is convenient when using the same tool for multiple actions that require different parameter sets.


With all of this in place, you can start using the tool in deployment sequences.


In the following blog posts, we will discuss the Release Paths / Environments and get closer to the actual Release Templates. And in between, Tools and Actions for dealing with custom Dynamics AX stuff will pop up as we go.

Release Management for VS2013 … and AX : Part 1

In a previous post I mentioned Release Management for Visual Studio 2013, but I didn’t take the time to go through its capabilities and why I was using it. We have spent some time on creating automated builds for AX and these have been up and running for a while now, using the code activities from Joris De Gruyter. Right now I am working together with my colleague Kevin Roos to investigate more automated deployments with AX 2012 R3. Along the road, I hope to find the time to write some posts that provide you with some insights along the way. This post is the first of them.

So the first thing we thought of was deciding which toolset we would be using. We didn’t want to reinvent the wheel, and yet most of the things available have their limitations. So a few brainstorm sessions later, we decided to go for the following set of tools:

Release Management for Visual Studio 2013

Formerly known as InRelease, Microsoft bought this solution from InCycle, incorporated it into the Visual Studio 2013 stack and named it Release Management for Visual Studio 2013. RM helps you deploy your solutions across the different stages of their lifecycle.

The pros are:

  • The tool enables the definition of environments, servers, environment stages to work on
  • A workflow based way of deploying between environment stages
  • The ability to create custom tools to be used within deployments
  • Template designer to easily drag and drop servers, actions, … in your release templates.
  • Tight integration with TFS Builds (Deploy build outputs, trigger deployments from successful builds, …)

The cons:

  • The first and foremost downside of RM is the fact that the tools and actions used within release templates do not use the same type of activities as the TFS Build CodeActivity types.
    It would be nice if Microsoft added the possibility to use code activities while they are working on the product now. Then it would be possible to build code activities that can be reused for both TFS Builds and RM Releases (and we could again make use of the code activities Joris created).
  • The second con might only be clear to people already using RM: the RM client could use a more user-friendly way of working your way around the different sections. I know this is not a big deal, but let me give an example. When deleting an action, you need to go through all of the templates manually to remove it before you can actually remove the action / tool. A reference tool or cascading remove would be nice 🙂

DynamicsAxAdmin sources (from Joris’ CodePlex)

Joris De Gruyter has created a set of code activities that can be used as TFS Build activities, and along with those activities he has a whole set of tools and plumbing code that can be reused. The most important thing here for us are the PowerShell CmdLets that are also available in his solution on CodePlex. A good example of a useful CmdLet is Get-ClientConfiguration, which will try to find the client configuration via registry, file, … and get all of the details like binary folders, installation directory, and so on. So along the way, we will try to reuse some of those CmdLets.

Dynamics Ax 2012 Management utilities

The management utilities will be used to reuse the CmdLets from Microsoft to get stuff done in the model store, deploy reports, deploy the portal, …

PowerShell module and ‘wrapper’ scripts

To do most of our own plumbing, PowerShell will be used. The idea is to create a PowerShell module that contains reusable functions to work with Dynamics AX. This module will be used by the various tools and actions that we will define within RM. Next to the module, we will create a few tools within RM, and each of these tools will have a ‘wrapper’ PowerShell script that looks at the different actions that can be performed with that tool and the parameters used for those actions. The reason for splitting into different tools is easier management from within RM and cleaner PowerShell files with separation of concerns.

So all of the above results in the following overview.


Let’s take a look at the cleaning of AX artifact folders to understand a bit more of what is going on there. The idea is to create a tool that can be used by several actions. For the cleaning of artifact folders, we only have a 1-1 relationship between actions and tools. The AX Folder cleaner tool actually runs PowerShell and uses attached resources (PowerShell files) to do the work. All of the actions that use the tool can use these same resources, but have a different command line to run them. More technical details on what is going on exactly will follow in the next post.