Release Management for VS2013 … and AX : Part 1

July 23rd, 2014

In a previous post I mentioned Release Management for Visual Studio 2013, but I didn’t take the time to go through its capabilities and why I was using it. We have spent some time on creating automated builds for AX, and these have been up and running for a while now using the code activities from Joris De Gruyter. Right now I am working together with my colleague Kevin Roos to investigate more automated deployments with AX 2012 R3. Along the road, I hope to find the time to write some posts that provide you with some insights. This post is the first of them.

So the first thing we thought of was deciding which toolset we would be using. We didn’t want to reinvent the wheel, and yet most of the things available have their limitations. So a few brainstorm sessions later, we decided to go for the following set of tools:

Release Management for Visual Studio 2013

Formerly known as InRelease, Microsoft bought this solution from InCycle, incorporated it into the Visual Studio 2013 stack and named it Release Management for Visual Studio 2013. RM helps you deploy your solutions across the different stages in their lifecycle.

The pros are:

  • The tool enables the definition of environments, servers, environment stages to work on
  • A workflow based way of deploying between environment stages
  • The ability to create custom tools to be used within deployments
  • Template designer to easily drag and drop servers, actions, … in your release templates.
  • Tight integration with TFS Builds (Deploy build outputs, trigger deployments from successful builds, …)

The cons:

  • The first and foremost downside of RM is the fact that the tools and actions used within release templates do not use the same type of activities as the TFS build CodeActivity types.
    It would be nice if Microsoft added the possibility to use code activities while they are working on the product now. Then it would be possible to build code activities that can be reused for both TFS Builds and RM Releases. (And we could again make use of the code activities Joris created.)
  • The second con might only be clear to people already using RM. The RM client could use a more user-friendly way of working your way around the different sections. I know this is not a big deal, but let me give an example. When deleting an action, you need to go through all of the templates manually to remove it before you can actually remove the action / tool. A reference tool or cascade delete would be nice :)

DynamicsAxAdmin sources (from Joris’ CodePlex)

Joris De Gruyter has created a set of code activities that can be used as TFS Build Activities, and along with those activities he has a whole set of tools and plumbing code that can be reused. The most important thing here for us are the PowerShell CmdLets that are also available in his solution on CodePlex. A good example of a useful CmdLet is the Get-ClientConfiguration CmdLet, which will try to find the client configuration via registry, file, … and get all of the details like binary folders, installation directory, and so on. So along the way, we will try to reuse some of those CmdLets.
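
To give you an idea of why these CmdLets are so handy, here is a quick sketch. Note that the module name and the property names are assumptions based on the CodePlex solution; check the actual project for the real ones.

```powershell
### Sketch only: module and property names are assumptions based on the
### DynamicsAxAdmin CodePlex solution; verify them against the real project.
Import-Module CodeCrib.AX.Client.PowerShell

# Let the CmdLet locate the active client configuration (registry, file, ...)
$config = Get-ClientConfiguration

# The returned object exposes the details we need in deployment scripts,
# such as the binary folder and log directory
$config.BinDir
$config.LogDir
```

This saves us from hard-coding installation paths in every deployment script.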

Dynamics Ax 2012 Management utilities

The management utilities will also be used, so we can reuse the CmdLets from Microsoft to get stuff done in the model store, deploy reports, deploy the portal, …

PowerShell module and ‘wrapper’ scripts

To do most of our own plumbing, PowerShell will be used. The idea is to create a PowerShell module that contains reusable functions to work with Dynamics AX. This module will be used by the various tools and actions that we will use within RM. Next to the module, we will create a few tools within RM, and each of these tools will have a ‘wrapper’ PowerShell script that looks at the different actions that can be performed with that tool and the parameters used for these various actions. The reason for splitting into different tools is easier management from within RM and cleaner PowerShell files with separation of concerns.
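
To make the wrapper idea a bit more concrete, a minimal sketch of such a script could look like the one below. The action names and the module/function names are made up for illustration; the real ones will take shape in later posts.

```powershell
### Minimal sketch of an RM tool 'wrapper' script. Action names and the
### module/function names are hypothetical.
param(
    [Parameter(Mandatory = $true)]
    [ValidateSet('CleanXppIL', 'CleanLabelArtifacts')]
    [string]$Action,

    [Parameter(Mandatory = $true)]
    [string]$ServerBinDir
)

# The shared module with reusable Dynamics AX functions
Import-Module "$PSScriptRoot\DynamicsAx.psm1"

# Each RM action invokes this same script with a different -Action value
switch ($Action)
{
    'CleanXppIL'          { Clear-AxArtifactFolder -Path (Join-Path $ServerBinDir 'XppIL') }
    'CleanLabelArtifacts' { Clear-AxArtifactFolder -Path (Join-Path $ServerBinDir 'Application\Appl') -Filter 'ax*.al?' }
}
```

This way one tool in RM can back several actions, each with its own command line.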

So all of the above results in the following overview.


Let’s take a look at the cleaning of AX artifact folders to understand a bit more of what is going on there. The idea is to create a tool that can be used by several actions. For the cleaning of artifact folders, we only have a 1-1 relationship between Actions and Tools. The AX Folder cleaner tool actually runs PowerShell and uses attached resources (PowerShell files) to do the work. All of the actions that use the tool can use these same resources but have a different command line to run them. More technical details on what is going on exactly will follow in the next post.

Failed to create a session revisited

July 15th, 2014

Today I was experiencing this error again when working on a new Hyper-V machine with imported demo data in AX 2012 R3:

Failed to create a session; confirm that the user has the proper privileges to log on to Microsoft Dynamics

In a previous post, I described a solution for this, but today the script in that post didn’t cut it. The reason is that there are more partitions involved now. So I wanted to tell you a bit more on how to solve it.

The cause of the issue still lies in the UserInfo table. But to show you, first start up a client in the Initial partition and open the partition form.


When we add a partition here, some things happen in the background. The AOS will also create new user entries in the UserInfo table for the ‘Admin’ and ‘Guest’ users. When you modify the client config to log on to the new partition, it will work and you can run the partition initialization checklist.



But when importing demo data using the Data Transfer Tool, partitions are imported, but the background procedure of adding the Admin and Guest users to the UserInfo table is not executed. So the real problem is that the UserInfo table does not contain the Admin and Guest records for each of the partitions. You need to make sure that for every partition available there is a matching pair of records in the UserInfo table. (In my case I created the records and played with the RecIds, but be careful with that.)
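
A quick way to check which partitions are affected is a query like the one below. This is a sketch against the AX 2012 R3 business database; verify the schema on your own environment before touching anything.

```sql
-- Sketch: list partitions that have no 'Admin' record in UserInfo.
-- Table and column names are as found in an AX 2012 R3 database.
SELECT p.RECID, p.PARTITIONKEY
FROM dbo.PARTITIONS p
WHERE NOT EXISTS
(
    SELECT 1
    FROM dbo.USERINFO u
    WHERE u.[PARTITION] = p.RECID
      AND u.ID = 'Admin'
);
```

Every partition this query returns is one that will throw the session error when you try to log on to it.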

ListPageInteraction lookup

July 7th, 2014

Small post and to the point :)

I wanted to point out a small detail that I had forgotten today in AX 2012. When you want to add custom logic to a ListPage form, you might want to write your own custom ListPageInteraction class. So what you need to do is just create a class that extends the ListPageInteraction class, and then you can select it in the combo box on the corresponding form, right? Well, the answer is no!

There’s a small detail here that might be missed. The kernel does not only look for classes that extend ListPageInteraction; your class name also needs to end with ‘ListPageInteraction’. So when you add a class and name it so long that you have to shorten it (MyTooLongClassNameForAListPInteract), it won’t show up in the list! Rename it to MyListPageInteraction and it shows up in the dialog!
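
So the class declaration that does show up in the lookup is as simple as this:

```xpp
// Extends ListPageInteraction AND the class name itself ends with
// 'ListPageInteraction', which is what the kernel lookup requires.
class MyListPageInteraction extends ListPageInteraction
{
}
```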

Windows Azure : Copy VHD between subscriptions

May 13th, 2014

Last week I was playing around with LifeCycle Services and tested out the ‘Cloud hosted environments’ feature. This is an awesome feature that will save many of you some hours in prepping demo virtual machines. (For now, it is only able to deploy a template environment suited for demo purposes, but more templates will be available later on)

But it didn’t take long until I discovered my first ‘missing feature’ here… I was stupid enough to link the wrong subscription to the LCS project and I ended up adding my pay-as-you-go subscription instead of the one with actual credit on it :| Soon after that I realized there is no option to unlink it again. So there you have it, the LCS project and one of my subscriptions are married to one another now!

So I had two options here:

  • Delete the VM in Azure and redeploy a new one from LCS
  • Copy the VM from one subscription to another.

The first one was the easy one, so to have a bit of a challenge I chose the second option. Let’s write some PowerShell to get things copied from one subscription to another! Here is how you do just that:

Install the Azure Powershell module

You can download and install the Azure PowerShell module by running the Microsoft Web Platform Installer. When finished, you can fire it up in Windows 8 by just typing “Power” at the Start screen and you will see it right away :).

Get your machine familiar with your Azure subscriptions

To be able to run the PowerShell commands in the next part, you will first need to import your subscription settings on your machine. Your first command here will be:

Get-AzurePublishSettingsFile
When running this cmdlet, your browser will pop up and take you to the correct link to download the publish settings files. When you have downloaded the files, run the following cmdlet to import it on your machine:

Import-AzurePublishSettingsFile "Path\subscriptionname.publishsettings"

Copy the VHD disks between subscriptions

Now that we are all set up, it’s time to start copying the virtual disks between the source and target subscription. Use the following script to do so.

### This assumes you have already imported the azure publishsettings file and the subscriptions are known on your machine.
Select-AzureSubscription "MyCoolSubscriptionName" 
### Defines the location of the VHD we want to copy
$sourceVhdUri = ""
### Defines the source Storage Account
$sourceStorageAccountName = "mysourcestorageaccountname"
$sourceStorageKey = "mywaytolongsourcestoragekey"
### Defines the target Storage Account
$destStorageAccountName = "mytargetstorageaccountname"
$destStorageKey = "myalsowaytolongtargetstoragekey"
### Create a context for the source storage account
$sourceContext = New-AzureStorageContext -StorageAccountName $sourceStorageAccountName `
                                         -StorageAccountKey $sourceStorageKey
### Create a context for the target storage account
$destContext = New-AzureStorageContext -StorageAccountName $destStorageAccountName `
                                       -StorageAccountKey $destStorageKey
### Name for the destination storage container (container names must be lowercase)
$containerName = "dynamicsdeployments"
### Create a new container in the destination storage
New-AzureStorageContainer -Name $containerName -Context $destContext 
### Start the async copy operation
$blob1 = Start-AzureStorageBlobCopy -srcUri $sourceVhdUri `
                                    -SrcContext $sourceContext `
                                    -DestContainer $containerName `
                                    -DestBlob "AX2012R3_DEMO_OS.vhd" `
                                    -DestContext $destContext
### Get the status of the copy operation
$status = $blob1 | Get-AzureStorageBlobCopyState 
### Output the status every 10 seconds until it is finished                                    
While($status.Status -eq "Pending"){
  Start-Sleep 10
  $status = $blob1 | Get-AzureStorageBlobCopyState
}

If you filled in the specifics of your own subscriptions, storage accounts, … you should see the following output while copying.

Copy the virtual machine between subscriptions

Now that we have transferred the virtual disks between subscriptions, we still have to copy the virtual machine between the two. And to do that, there is a nice script available in the TechNet Script Center: Copy a Virtual Machine between subscriptions

So there you have it, your VM is now copied to a second subscription and if you like, you can just remove it from the source subscription.


Release Management – Customize timeouts on default tools / actions

April 14th, 2014

In Release Management for Visual Studio 2013, you can customize the tools and actions you work with. The last few days I was creating a release template for use with Dynamics AX 2012. One of the actions early in the template is backing up the database. (Of course you need to back up first :-) ) And there I faced a little inconvenience: I could not adjust the timeout of the restore action, but this was needed as an AX model store database restore could not be completed within 5 minutes.

So I decided to go back to the Restore SQL Database action and adjust the timeout setting. But hey, you are unable to do that on the default tool!


There is a parameter to set up default timeouts, but this only applies when creating new actions. So that doesn’t help a lot.


So for me, the most convenient and simple solution was to copy the tool that is available out of the box and create a new one from it. Then you do have the ability to set the timeout.




The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.

March 17th, 2014

Today I experienced this problem while importing models from the build machine into one of the developer machines:

The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.

A quick search on the internet took me to this blog post, which explains the importance of initializing / updating your model store after installing CUs. To initialize/update your model store you simply have to run AxUtil with the schema command. (The PowerShell alternative is Initialize-AXModelStore.)
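
For reference, the two variants look like this. The server and database names are placeholders; fill in your own.

```powershell
### Update the model store schema after installing a CU.
### Variant 1: AxUtil (run from the Management Utilities folder)
axutil schema /s:MySqlServer /db:MyAxDatabase

### Variant 2: the PowerShell CmdLet from the AX management utilities
Initialize-AXModelStore -Server "MySqlServer" -Database "MyAxDatabase"
```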

For a full list of AXUtil and PowerShell commands, please refer to MSDN.

Announcing Microsoft Dynamics AX 2012 R2 Services

March 5th, 2014

Hello everybody!

I’m very happy to announce Microsoft Dynamics AX 2012 R2 Services, published by Packt Publishing and authored by Klaas and myself. It is the successor of Microsoft Dynamics AX 2012 Services, which was very well received. Based on your feedback and on the new R2 version, we started working on an update in September 2013. We have gone through all existing content, updated and improved it, and added a lot of new content. We are really happy with the result and are excited to hear your reactions.

We are working hard on the book at the moment; the expected release date is May 2014, but you can already pre-order it.

There are also a couple of giveaways that can be won! More information can be found in the full blog post created by Klaas.


The Registry.CurrentUser impediment

February 19th, 2014

Today I was working on creating a build process template for AX 2012. I will not go into the details about the build process template and the custom activities, as these will be covered by Joris De Gruyter later on. But still, I wanted to show you guys a nasty little problem I experienced yesterday.

The build process threw an exception: “Object reference not set to an instance of an object”

So, this was of course a null reference, but after some investigation ( by using primitive logging because this was running in the agent process :-) ) it turned out to be located in this code:

public static string GetInstallDir32()
{
    RegistryKey componentsKey = Registry.CurrentUser.OpenSubKey(regKey, false);
    String installDir32 = componentsKey.GetValue("InstallDir32", null) as String;
    return System.Environment.ExpandEnvironmentVariables(installDir32);
}

It turned out to be the Registry.CurrentUser property that returned null. And then it struck me: the agent ran under a service account (s_tfs_build), and since this account was never used to log on interactively, the profile of this user was not loaded up to that point.

The simple solution was to log on to the system with the service account so that the user profile and registry hive were loaded. From then on, the Registry.CurrentUser property was properly filled.

Technical conference 2014 : Day two

February 5th, 2014

Today we were present early for the second day at the technical conference. The plan for today was to follow these sessions:

  • Data synchronisation between multiple instances using Master Data Services
  • Optimizing the performance of an AX deployment
  • Compelling report designs
  • Help! MRP is slow

I could put all of my notes from today here, but there will be plenty of blog posts dealing with that, so I’ll just put in the fragments that I’ll really remember from today:

  • MDS
    • The master data services use SQL Server and work on an entity level to copy data between AX instances.
    • Change tracking is used, but this causes some overhead, so pay attention to it and check whether it is acceptable in your production environment.
  • Performance
    • Pay attention to your batch group setup when multi-threading your batch jobs. Runtime tasks will be created in the empty batch group if you do not specify the group via code!
    • Parameter sniffing
    • The rest was actually stuff that everyone who has been working with AX for a couple of years should already know:
      index optimization, recordset-based operators, …
    • The 5 key points when looking at performance:
      • Few or many calls
      • Find where you lose time
      • Running or waiting: find out where your code is waiting instead of running. Total elapsed time for SQL statements should be as close to the actual running time as possible.
      • 20 / 80 : With 20% of your effort, you can solve 80% of your performance problems.
      • 80 / 10 / 10: 80% is in your code; the remaining 20% is split between hardware and configuration.
  • Compelling report designs
    • TJ Vassar is by far the most interesting figure on stage the last few days! In 3 words: Chill, funny and honest :)
    • To be able to play around with your design without having to run the report from inside AX to get the right data from your data provider, you can make your temporary table non-temporary, fill it with data, and switch over to Visual Studio entirely. Also, do not forget to implement the processReport() method and comment out the super() call. This makes sure that, as long as you are working on your layout, the report is not processed within AX when you run it in SSRS.
    • Microsoft really took the feedback to heart and fixed lots of issues concerning reporting, including new tools for comparing reports.
    • Emailing parameters! A customization that lots of us have made is extending the print settings to be able to specify more options and select email addresses from a dropdown. Well, now it’s in the standard.
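
The processReport() trick from the design tip above boils down to something like this in the report data provider class (a sketch; remember to undo it once the layout is done):

```xpp
// Sketch: keep AX from re-processing the data while you are designing,
// so SSRS just renders what is already in the (temporarily non-temporary) table.
public void processReport()
{
    // super(); // commented out while working on the layout
}
```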

Technical conference 2014 : Day one

February 4th, 2014

I’m putting a blog post online today because yesterday the WiFi was really letting us down. But anyway, it turned out to be a very interesting day!


It all started out with what Microsoft is good at: giving presentations! The keynote was packed with content and demos, so we were immediately excited to see some of them.

  • Professional services
    This demo showed a few new neat features when it comes to project management inside Dynamics AX
  • Retail
    This was a really nice demo! It was packed with new features and showed the new mobile experiences, with nice demos of online shops.
    I was really impressed with all of the features available for retail.
  • Warehousing
    Another interesting demo was about the Blue Horseshoe warehousing solution. This really made me realize that all of the features I have been implementing for the last 5 years are actually in there.
    There was a nice demo of the mobile app that lets you do things like picking, …
    Another nice addition here is the possibility to disable warehouse processes for certain inventory locations. Someone from the Blue Horseshoe guys even noted that they have eliminated the need for quarantine warehouses. There is an extra status now on the on-hand inventory screen where you can specify that this inventory cannot be used for other processes.

A day in the life of a developer

The first actual session I attended was called “A day in the life of a developer / IT Manager”.

When I look at my scribbles, there’s a lot in there. But let me just mention a few key things that I noticed:

  • Trace parser
    Developers do not use the trace parser enough when they have finished coding a feature. Being proactive means tracing your code to weed out the obvious flaws.
  • LCS model upload
    We are using the feature in Lifecycle Services to upload the model for customization analysis. People should use that more often too.
    There is actually an HTML report available in the same format as the compiler output, so developers can import it in AX to directly access the corresponding code.
  • Machines for building
    When devs check in their code, there should obviously be a build process that builds the code. For this, developers request hardware, but keep the following in mind when looking at build performance:

    • Put all of the components on one box
    • Put in at least 16GB of RAM
    • Do not constrain the SQL memory
    • Install KB2844240 to optimize index performance
    • Request faster CPUs first instead of more CPUs
    • Install an SSD on the build machine
    • Use AxBuild.Exe :-)

Data export and Import Framework (DIXF)

The session about DIXF was also nice as a recap of what I already knew by working with DIXF at a customer site.

There were some interesting new features available in the R3 version:

  • Compare and copy data between companies (Only on entity level)
  • Copy entity data across different Microsoft Dynamics AX environments (Only on entity level)
  • MDM on top of DIXF
    Reuses DIXF for master data management (More in posts to come on this)
  • Out of the box support for 150+ entities
    • Master entities
    • Documents
    • Journals
    • Parameters
    • Reference data
    • Configuration data (EP Settings, Server Settings, Batch Settings, …)
  • Performance of staging to target
    There is support for parallel task bundling. Large files can be split up over several tasks.
  • There is a compare entity wizard that can compare entities between companies

Create AX Builds using the new server side compiler

For me, this was interesting because I have been playing around a lot with builds, and the session turned out to be more like a Q&A session.

First there was an explanation of how the compiler works. Roughly speaking, there are 3 main parts:

  • Phase 1: Header compilation for tables and classes.
  • Phase 2: Complete X++ sources compiled into p-code. The error log gets created.
  • Phase 3: Compile the error list. The number of passes depends on the errors: as long as there are errors left, a new pass is tried (up to pass 5).

The important part here is that this used to be done on all of the tiers: the client performed the compilation and the communication went all the way through the AOS to the SQL database. But this has been solved, and the compilation is now actually done on the server. It is a 64-bit compilation and X++ execution is kept to a minimum.
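
As an example, a typical AxBuild command line looks like this. The instance number and paths depend on your installation, so treat them as placeholders.

```bat
:: Run on the AOS server; 01 is the AOS instance number in this example.
axbuild.exe xppcompileall /s=01 /altbin="C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\Bin" /workers=4
```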

After the session, later in the evening, we met up with Robert Badawy, who gave the session. It was nice to have a chat with him and get his view on all of this and on how Microsoft runs the build process internally.