What about this new eXceLleNT Dynamics AX compiler?

With the new major release of Dynamics AX out the door, we can dig a little deeper into some of the new stuff that is now available to us. And boy, did they put in a lot of effort! Previous major releases did bring us some new tools, but there was a balance between new tooling and new functionality. With this major release, the focus is definitely on getting the technology stack up to date. One of those new things available to us is a whole new, nice and shiny compiler!


Many of you will recognize the picture below. In Dynamics AX 2012 RTM and FP, you could literally wait for hours for your compiler to finish. This was because of the single-threaded 32-bit compiler in the client. R2 CU6+ brought some relief by introducing a command line tool that could compile your application with multiple workers, but the process of compiling X++ and then CIL was still not quite what it should be.


But now those days are over! A new compiler rises! So let’s have a look and see what stuff this brings along.

XLNT (X++ Language Toolkit, aka Excellent)

XLNT is a framework written in C# that is similar to the Roslyn compiler framework (https://roslyn.codeplex.com/), but much smaller. It has been built from scratch in C# because the old compiler stack (C++) dates from a time when hardware and CPU power were not as powerful as today.


Below you can see a simplified schema of how the compiler is built now. It uses a phased approach where you are able to ‘plug in’ passes based on what you want to achieve (compilation, best practice checking, …). You can think of the compiler as a set of building blocks that you can combine for different scenarios.
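To make the building-block idea concrete, here is a small, purely illustrative sketch in Python of what a pass-based pipeline looks like. All class and method names here are invented for the example and are not the actual XLNT API:

```python
# Illustrative pass-based pipeline; all names here are invented, not XLNT's API.

class BestPracticePass:
    """Collects warnings but passes the tree through unchanged."""
    def __init__(self):
        self.warnings = []

    def run(self, nodes):
        self.warnings += [n["name"] for n in nodes if n.get("deprecated")]
        return nodes

class CodeGenPass:
    """Stand-in for code generation: emits the node names."""
    def run(self, nodes):
        return [n["name"] for n in nodes]

def compile_with(passes, nodes):
    """Feed each plugged-in pass the previous pass's output."""
    result = nodes
    for p in passes:
        result = p.run(result)
    return result

nodes = [{"name": "CustTable.insert"},
         {"name": "Global.oldHelper", "deprecated": True}]

bp = BestPracticePass()
print(compile_with([bp, CodeGenPass()], nodes))  # both method names
print(bp.warnings)                               # ['Global.oldHelper']
```

A nightly build could simply leave the BestPracticePass out of the list and skip its cost entirely, which is exactly the scenario-based composition described above.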


Visual Studio Integration

In this major version of Dynamics AX, Visual Studio has become the IDE for all of your daily X++ source coding. Because of that, one of the two ways you can access the compiler is from within the Visual Studio GUI (the other is a command line tool called xppc, which deserves a post of its own).


To start a compilation, you can do one of two things:


  • You can just build your solution / project as you would with any other C# application. This invokes an incremental compile of what has changed. To give you a responsive experience within Visual Studio, this type of compilation is triggered as you go around changing things, and it only compiles a subset of your model by recompiling just the netmodules in which the changed objects reside.
  • Another way of compiling is to use the new Dynamics AX menu to perform a build on the models you select. This is called a full compilation of the model and will compile all of the netmodules related to your selected models.


When you save source code written in Visual Studio, every single method is persisted into its own XML file. The XLNT framework is responsible for writing to and reading back from these XML files (aka stitching and unstitching).
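As a rough mental model of stitching and unstitching, here is a toy Python sketch. The real file layout and XML schema are XLNT internals, so the element names and file naming below are pure assumptions for illustration:

```python
# Toy model of "unstitching" (one XML file per method) and "stitching" it back.
# The element names and file naming convention are invented for this example.
import xml.etree.ElementTree as ET

def unstitch(class_name, methods):
    """methods: dict of method name -> source. Returns one XML doc per method."""
    files = {}
    for name, source in methods.items():
        root = ET.Element("Method", Name=name, Class=class_name)
        ET.SubElement(root, "Source").text = source
        files[f"{class_name}.{name}.xml"] = ET.tostring(root, encoding="unicode")
    return files

def stitch(files):
    """Read the per-method XML files back into (class, method) -> source."""
    out = {}
    for doc in files.values():
        root = ET.fromstring(doc)
        out[(root.get("Class"), root.get("Name"))] = root.find("Source").text
    return out

files = unstitch("CustTable", {"insert": "void insert() {}"})
print(stitch(files))  # {('CustTable', 'insert'): 'void insert() {}'}
```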


Basic navigation (things like Go to declaration) relies on cross references. To enable this, the compiler contains a pass that handles cross references and bulk copies the related data to the cross reference database (DYNAMICSXREFDB). Also keep in mind that when you do not need a building block like this (say, in a nightly build), you can simply leave it out and compilation will take much less time.
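Conceptually, the cross-reference pass just records, for every symbol, the places where it is used, so navigation becomes a simple lookup. A toy sketch in Python (the real pass of course bulk-inserts into DYNAMICSXREFDB, and the method/symbol names below are made up):

```python
# Toy cross-reference builder: symbol -> list of usage sites.
from collections import defaultdict

def build_xref(compiled_methods):
    """compiled_methods: list of (method_name, symbols_used) pairs."""
    xref = defaultdict(list)
    for method, symbols in compiled_methods:
        for sym in symbols:
            xref[sym].append(method)   # record every usage site of the symbol
    return xref

xref = build_xref([
    ("CustTable.insert",  ["NumberSeq::newGetNum", "this.validateWrite"]),
    ("SalesTable.insert", ["NumberSeq::newGetNum"]),
])

# "Find all references" for NumberSeq::newGetNum is now just a dictionary lookup:
print(xref["NumberSeq::newGetNum"])  # ['CustTable.insert', 'SalesTable.insert']
```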

X++ Compilation

In previous versions, not all of the code was generated as MSIL. Part of the code was still compiled into p-code and interpreted at run time. Another part was transformed into MSIL by XSL-transforming the p-code generated by the compiler. As of now, this is entirely gone and everything is generated as MSIL managed code by the compiler!


On top of that, the new compiler is massively parallel. As Microsoft states: “This thing will get its hands on all of the resources it can take to do as many things in parallel as possible. It means that you need to run this on a machine with at least 16GB of memory for a full build. That’s just how it’s designed.”


Another really nice feature is the diagnostics support. At any point in time, the compiler can output error messages that are included in the diagnostic telemetry available for the application. Based on this telemetry data, you can get insight into different aspects of compilation, and Microsoft is able to act on compiler errors, crashes, … much quicker than before to improve the compiling experience.

Best Practices

Let’s not leave out best practices. As Peter Villadsen put it: “BPs are important; anyone who says different has not been in the industry long enough and not suffered enough. They are the result of many years of industry practice and should not be disregarded.” And I could not agree more. For years now, we have been making an effort, and succeeding, in customizing Dynamics AX without introducing new best practice violations. To help with this, the XLNT framework also provides the tooling for checking best practices and authoring custom rules.


The best practice checking tool, like the compiler, can run automatically when compiling or can be invoked from a command line tool. And Microsoft has clearly taken this seriously: for the first time ever, they too have adopted a zero tolerance approach for all of their code. So yes, all best practices have been addressed!


So this is, in a nutshell, what the new compiler is all about. Stay tuned for more!

Release Management for VS2013 … and AX : Part 1

In a previous post I mentioned Release Management for Visual Studio 2013, but I didn’t take the time to go through its capabilities and why I was using it. We have spent some time on creating automated builds for AX, and these have been up and running for a while now using the code activities from Joris De Gruyter. Right now I am working together with my colleague Kevin Roos to investigate more automated deployments with AX 2012 R3. Along the road, I hope to find the time to write some posts that provide you with insights along the way. This post is the first of them.

So the first thing we did was decide which toolset we would be using. We didn’t want to reinvent the wheel, and yet most of the things available have their limitations. A few brainstorm sessions later, we decided to go for the following set of tools:

Release Management for Visual Studio 2013

Formerly known as InRelease, this solution was bought by Microsoft from InCycle, incorporated into the Visual Studio 2013 stack, and renamed Release Management for Visual Studio 2013. RM helps you deploy your solutions across the different stages of their lifecycle.

The pros are:

  • The tool enables the definition of environments, servers, environment stages to work on
  • A workflow based way of deploying between environment stages
  • The ability to create custom tools to be used within deployments
  • Template designer to easily drag and drop servers, actions, … in your release templates.
  • Tight integration with TFS Builds (Deploy build outputs, trigger deployments from successful builds, …)

The cons:

  • The first and foremost downside of RM is the fact that the tools and actions used within release templates do not use the same type of activities as the TFS build CodeActivity types.
    It would be nice if Microsoft added the possibility to use code activities while they are working on the product now. Then it would be possible to create code activities that can be reused for both TFS Builds and RM Releases. (And we could again make use of the code activities Joris created.)
  • The second con might only be clear to people already using RM. The RM client could use a more user-friendly way of working your way around the different sections. I know this is not a big deal, but let me give an example. When deleting an action, you need to go through all of the templates manually to remove it before you can actually remove the action / tool. A reference tool or cascading remove would be nice 🙂

DynamicsAxAdmin sources (from Joris’ CodePlex)

Joris De Gruyter has created a set of code activities that can be used as TFS Build Activities, and along with those activities there is a whole set of tools and plumbing code that can be reused. The most important thing here for us are the PowerShell CmdLets that are also available in his solution on CodePlex. A good example of a useful CmdLet is Get-ClientConfiguration, which will try to find the client configuration via registry, file, … and get all of the details like binary folders, installation directory, and so on. So along the way, we will try to reuse some of those CmdLets.

Dynamics AX 2012 Management Utilities

The management utilities will also be used so we can reuse the CmdLets from Microsoft to get stuff done in the model store: deploying reports, deploying the portal, …

PowerShell module and ‘wrapper’ scripts

To do most of our own plumbing, PowerShell will be used. The idea is to create a PowerShell module that contains reusable functions to work with Dynamics AX. This module will be used by the various tools and actions that we will define within RM. Next to the module, we will create a few tools within RM, and each of these tools will have a ‘wrapper’ PowerShell script that looks at the different actions that can be performed with that tool and the parameters used for those actions. The reason for splitting into different tools is easier management from within RM and cleaner PowerShell files with separation of concerns.
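The wrapper pattern itself is simple: one entry point per tool that dispatches on an action name and forwards the remaining parameters to reusable module functions. Here is a sketch of the idea (written in Python for brevity; the real scripts are PowerShell, and the action and function names below are made up):

```python
# Sketch of a per-tool 'wrapper': dispatch on an action name, delegate to
# reusable "module" functions. All names here are invented for illustration.

def clean_artifact_folder(kind):
    # placeholder for a reusable module function that would delete files
    return f"cleaning artifact folder: {kind}"

ACTIONS = {
    "CleanXppIL":  lambda *params: clean_artifact_folder("XppIL"),
    "CleanLabels": lambda *params: clean_artifact_folder("Labels"),
}

def wrapper(action, *params):
    """The wrapper entry point: look up the requested action and run it."""
    if action not in ACTIONS:
        raise ValueError(f"unknown action: {action}")
    return ACTIONS[action](*params)

print(wrapper("CleanXppIL"))  # cleaning artifact folder: XppIL
```

Each RM action then only differs in the command line it passes to the same wrapper, which is what keeps the tool count low and the scripts clean.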

So all of the above results in the following overview.


Let’s take a look at the cleaning of AX artifact folders to understand a bit more of what is going on there. The idea is to create a tool that can be used by several actions. For the cleaning of artifact folders, we only have a 1-1 relationship between actions and tools. The AX Folder cleaner tool actually runs PowerShell and uses attached resources (PowerShell files) to do the work. All of the actions that use the tool can use these same resources but have a different command line to run them. More technical details on what exactly is going on will follow in the next post.

Technical conference 2014 : Day one

I’m putting a blog post online today because yesterday the Wifi was really letting us down. But anyway, it turned out to be a very interesting day!


It all started out with what Microsoft is good at: giving presentations! The keynote was packed with content and demos, so we were immediately excited to see some of them.

  • Professional services
    This demo showed a few new neat features when it comes to project management inside Dynamics AX
  • Retail
    This was a really nice demo! It was packed with new features and showed the new mobile experiences and nice demos of online shops.
    I was really impressed with all of the features available for retail.
  • Warehousing
    Another interesting demo was about the Blue Horseshoe warehousing solution. This really made me realize that all of the features I have been implementing for the last 5 years are actually in there.
    There was a nice demo of the mobile app that lets you do things like picking, …
    Another nice addition here is the possibility to disable warehouse processes for certain inventory locations. One of the Blue Horseshoe guys even noted that they have eliminated the need for quarantine warehouses. There is an extra status now on the on-hand inventory screen where you can specify that this inventory cannot be used for other processes.

A day in the life of a developer

The first actual session I attended was called “A day in the life of a developer / IT Manager”.

When I look at my scribbles, there’s a lot in there. But let me just mention a few key things that I noticed:

  • Trace parser
    Developers don’t use the trace parser enough when they are finished coding a feature. Being proactive means tracing your code to catch the obvious flaws.
  • LCS model upload
    We are using the feature in Lifecycle Services to upload the model for customization analysis. People should use that more often too.
    There is actually an HTML report available in the same format as the compiler output, so developers can import it into AX to directly access the corresponding code.
  • Machines for building
    When devs check in their code, there should obviously be a build process that builds it. Developers request hardware for this, and the following should be kept in mind when looking at build performance:

    • Put all of the components on one box
    • Put in at least 16GB of RAM
    • Do not constrain the SQL memory
    • Install KB2844240 to optimize index performance
    • Request faster CPUs first instead of more CPUs
    • Install an SSD on the build machine
    • Use AxBuild.Exe 🙂

Data Import/Export Framework (DIXF)

The session about DIXF was also a nice recap of what I already knew from working with DIXF at a customer site.

There were some interesting new features available in the R3 version:

  • Compare and copy data between companies (Only on entity level)
  • Copy entity data across different Microsoft Dynamics AX environments (Only on entity level)
  • MDM on top of DIXF
    Reuses DIXF for master data management (More in posts to come on this)
  • Out of the box support for 150+ entities
    • Master entities
    • Documents
    • Journals
    • Parameters
    • Reference data
    • Configuration data (EP Settings, Server Settings, Batch Settings, …)
  • Performance of staging to target
    There is support for parallel task bundling. Large files can be split up over several tasks.
  • There is a compare entity wizard that can compare entities between companies

Create AX Builds using the new server side compiler

For me, this was interesting because I have been playing around a lot with builds, and the session turned out to be more like a Q&A session.

First there was an explanation of how the compiler works. Roughly speaking, there are 3 main parts:

  • Phase 1: Header compilation for tables and classes.
  • Phase 2: Complete X++ sources compiled into p-code. An error log gets created.
  • Phase 3: Compile the error list. The number of passes is not fixed but depends on the errors: a pass counter is incremented and, as long as there are errors, a new pass is tried (up to pass 5).
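That retry behaviour in phase 3 boils down to a simple loop, sketched here in Python pseudologic (this is an illustration of the idea, not actual compiler code):

```python
# Illustrative sketch of phase 3: keep recompiling the error list until it is
# empty or the maximum number of passes (5) is reached.

def phase3(compile_errors, max_passes=5):
    """compile_errors: callable taking the current error list and returning
    the errors that remain after recompiling just those objects."""
    errors = compile_errors(None)          # initial error list from phase 2
    passes = 1
    while errors and passes < max_passes:
        errors = compile_errors(errors)    # retry only the failing objects
        passes += 1
    return errors, passes

# Toy model: each pass resolves one forward-reference error.
queue = [["A", "B", "C"], ["B", "C"], ["C"], []]
remaining, passes = phase3(lambda _: queue.pop(0))
print(remaining, passes)  # [] 4
```

This is why builds with many forward references can take extra passes, while a clean model stops after the first one.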

The important part here is that this used to be done across all of the tiers: the client performed the compilation and the communication went all the way through the AOS to the SQL database. This has been solved, and the compilation is now actually done on the server. It is a 64-bit compilation and X++ execution is kept to a minimum.

After the session, later in the evening, we met up with Robert Badawy, who gave the session. It was nice to have a chat with him and get his view on all of this and on how Microsoft does the build process internally.

Debug Alerts in Dynamics Ax 2009

Today I was creating some alert rules and needed to make some modifications to the alerting functionality. Something didn’t go quite as expected, so I wanted to debug the execution of an alert action.

Here is how you can do it :

  1. Make sure the debug mode is set to ‘when breakpoint’ for the user linked to the alert
  2. Put a breakpoint in the EventJobCUD class, method Run
  3. Change the EventJobCUD class, method Run, so that it does not use the runAs command:

(Revert these changes when the debugging is done; for now you need to make this change, otherwise you will not be able to step into the EventAction classes for debugging.)

void run()
{
    EventCUD            event;
    container           params;
    RunAsPermission     runAsPermission;

    while select event
        group by UserId
        where (event.Status    == BatchStatus::Waiting) &&
              (event.CompanyId == curext())
    {
        params          = connull();
        runAsPermission = new RunAsPermission(event.UserId);
        runAsPermission.assert();

        //BP Deviation Documented
        //runas(event.UserId, classnum(EventJobCUD), staticmethodstr(EventJobCUD, runCudEventsForUser), params,
        //      curext(), EventJobCUD::getLanguageId(event.UserId));

        // For debugging: call EventJobCUD::runCudEventsForUser directly here
        // (instead of through runas) so that the debugger can step into the
        // EventAction classes.
    }
}

X++ run a process with user credentials

When you want to run, for example, a command prompt from within the X++ language, you can use the following code.
Let’s assume that the following parameters were passed to the method:

  • FileName : “C:\windows\system32\cmd.exe”
  • Arguments : “”
  • RunAsAdmin : true
  • UseShellExecute : true
  • WaitForExit : false


static void runWindowsProcess(  FileName    _fileName
                            ,   str         _arguments
                            ,   boolean     _runAsAdmin         = false
                            ,   boolean     _useShellExecute    = false
                            ,   boolean     _waitForExit        = false)
{
    System.Diagnostics.ProcessStartInfo processStartInfo;
    System.Diagnostics.Process          process;
    System.OperatingSystem              operatingSystem;
    System.Version                      operatingSystemVersion;
    System.Exception                    ex;
    int                                 major;
    boolean                             start;

    try
    {
        // Assert CLRInterop permission
        new InteropPermission(InteropKind::ClrInterop).assert();

        // BP deviation documented
        processStartInfo    = new System.Diagnostics.ProcessStartInfo(_fileName, _arguments);

        // BP deviation documented
        process             = new System.Diagnostics.Process();

        // Get an instance of the System.OperatingSystem class
        operatingSystem         = System.Environment::get_OSVersion();

        // Get the OS version number
        operatingSystemVersion  = operatingSystem.get_Version();

        // If admin rights were asked for
        if (_runAsAdmin)
        {
            // Get the major part of the version number (UAC exists starting from Windows Vista)
            major = operatingSystemVersion.get_Major();

            if (major >= 6)
            {
                // Use the runas verb, which will prompt for administrator credentials
                processStartInfo.set_Verb('runas');
            }
        }

        // Use the system shell to execute the process
        processStartInfo.set_UseShellExecute(_useShellExecute);

        // Attach the starting information to the process
        process.set_StartInfo(processStartInfo);

        // Start the process
        start = process.Start();

        // Wait for the process to finish
        if (_waitForExit)
        {
            process.WaitForExit();
        }

        // Revert permission
        CodeAccessPermission::revertAssert();
    }
    catch (Exception::CLRError)
    {
        // BP Deviation documented
        ex = ClrInterop::getLastException();

        while (ex != null)
        {
            info(ex.get_Message());
            ex = ex.get_InnerException();
        }

        throw error(strFmt("@HEL270", _fileName));
    }
}

Dynamics AX 6.0 – SQL AOD

Okay, have a seat and hold on to whatever you can hold on to…

Since the beginning of AX (Axapta), we have been used to working with AOD files. But it seems that the days of the AOD file format are numbered!

There are a couple of reasons why the use of AOD files has become obsolete:

  • The MorphX environment has been decreasing in performance because of the increasing number of objects in the AOT
  • Searching for text in the AOT nodes takes a huge amount of time

So if AOD files don’t cut it anymore, what will be used instead? Well, Dynamics AX 6.0 will be the first version ever to have the AOT metadata stored in the SQL Server backend!
For users of MorphX there is no difference, except for the fact that it will be much more responsive and faster (e.g. searching for a text in all the methods of all forms completes in 2 seconds!).

The next question will probably be: how are we going to transport between different AX environments? Well, the answer is the new file format: *.axModel
This is also a binary file format, like the AOD file, but model files are much smaller than AOD files (about half the size), and there will be a new tool to export/import them into a SQL database.

The client will be configurable to point to a certain model file to use in the client session. So it will be possible to fire up an AX client and choose a certain model file to work in. You can then create your solution, and when you’re done, all the changes will be available in that .axModel file and can easily be transported to other environments.

This will be a benefit in several situations, but if you want to read the full explanation, you can find a lot of info here : http://blogs.msdn.com/mfp/

Dynamics Ax Get and Remove text from the Infolog

Sometimes you want a process to log infolog data into a text field instead of sending it to the infolog. For example, you are posting 6 custom-made journal tables, and if something goes wrong you want the infolog data to be stored in a log field on the table while you continue processing the other journal tables.

Then you will have a catch statement where you need to get the infolog data and store it into the field, but there are some difficulties:

  • First, you only want the data generated by your process, not the data that might already have been in the infolog.
  • Second, you want to clear the errors from your code from the infolog without clearing all the rest of the infolog data.

The key is to use the infolog.line() method to remember where to start keeping track of the lines in the infolog.

We can demonstrate this with the following job:

static void LIB_KeSae_TestInfologRemoval(Args _args)
{
    int line;
    str omittedText;

    // Two entries that should stay in the infolog
    info('Entry 1');
    info('Entry 2');

    // Now we have 2 records in the infolog.
    // From now on we want the following entries in the infolog to be stored
    // somewhere and omitted from the infolog window.
    // Remember the number of entries before the custom code (in this case line = 2)
    line = infolog.line();

    // Customer code
    info('Entry 3');
    info('Entry 4');
    info('Entry 5');

    // Here we want to retrieve statements 3, 4 and 5 and clear those from the infolog
    omittedText = global::LIBGetTextFromInfoLog(line + 1, true);

    // Some infolog entries after our code
    info('Entry 6');
}

The following is the code of the LIBGetTextFromInfoLog method:

public static str LIBGetTextFromInfoLog(int _startLine = 1, boolean _clearInfo = true)
{
    int currentLine;
    str retValue;

    /*
        Do not use infolog.infologData to get the infolog data, because that creates
        a copy and clears the actual infolog. The infolog.line() method would then
        return 0 afterwards and you could not clear the selected lines anymore.
    */

    for (currentLine = _startLine; currentLine <= infolog.line(); currentLine++)
    {
        retValue += '\n' + infolog.text(currentLine);
    }

    if (_clearInfo)
    {
        // _startLine - 1 because the infolog method will add 1
        infolog.clear(_startLine - 1);
    }

    return retValue;
}

The infolog.clear() method now takes the start line into account and only clears the infolog entries from our code, leaving the other entries intact, as shown in the result of this job:

Result of the infolog after removing entries

Dynamics Ax Disable / Enable all datasource fields

Sometimes you want to disable all fields on a form’s data source and enable some of them afterwards. Put the following piece of code in the Global class and you can call it from wherever you like.

public static void LIBSetAllTableFields(FormDataSource _formDataSource, boolean _value)
{
    DictTable   dictTable = new DictTable(_formDataSource.table());
    int         i;
    int         fieldNumber;
    for (i = 1; i <= dictTable.fieldCnt(); i++)
    {
        fieldNumber = dictTable.fieldCnt2Id(i);
        // Enable or disable the data source field based on _value
        _formDataSource.object(fieldNumber).allowEdit(_value);
    }
}

Generic SysTableLookup Method

You often need to write a custom lookup method, and the SysTableLookup class is useful for creating lookups from code. The following method uses SysTableLookup in the background but is easier to call.

When using the SysTableLookup class, most simple lookups (1 datasource table) look the same. You need the following:

TableId, LookupControl, LookupFields, ReturnFields, SortFields and sometimes the use of a tmp table.

The following method is a generic wrapper around the SysTableLookup class:

public static void doLookup(TableId             _tableId,
                            container           _lookupFields,
                            container           _sortFields,
                            FormStringControl   _control,
                            FieldId             _returnItemFieldNum,
                            Map                 _queryRanges    = null,
                            boolean             _useTmpTable    = false,
                            Common              _tmpBuffer      = null)
{
    SysTableLookup          sysTableLookup  = SysTableLookup::newParameters(_tableId, _control);
    Query                   query           = new Query();
    QueryBuildDataSource    qbds;
    MapEnumerator           rangeEnumerator;
    int                     i;
    FieldId                 lookupFieldId;

    for (i = 1; i <= conlen(_lookupFields); i++)
    {
        lookupFieldId = conPeek(_lookupFields, i);
        // Mark the return field; add the other fields as plain lookup fields
        if (lookupFieldId == _returnItemFieldNum)
        {
            sysTableLookup.addLookupfield(lookupFieldId, true);
        }
        else
        {
            sysTableLookup.addLookupfield(lookupFieldId);
        }
    }

    qbds = query.addDataSource(_tableId);

    for (i = 1; i <= conlen(_sortFields); i++)
    {
        qbds.addSortField(conPeek(_sortFields, i));
    }

    // Add the query ranges (field id -> range value)
    if (_queryRanges)
    {
        rangeEnumerator = _queryRanges.getEnumerator();
        while (rangeEnumerator.moveNext())
        {
            qbds.addRange(rangeEnumerator.currentKey()).value(rangeEnumerator.currentValue());
        }
    }

    // Use the supplied temporary buffer when requested
    if (_useTmpTable && _tmpBuffer)
    {
        sysTableLookup.parmTmpBuffer(_tmpBuffer);
    }

    sysTableLookup.parmQuery(query);
    sysTableLookup.performFormLookup();
}

Now when you want to create a lookup, you can do it more easily like this:

public void lookup()
{
    container   fieldNums       = [fieldNum(CustTable, AccountNum), fieldNum(CustTable, Name)];
    container   sortFields      = [fieldNum(CustTable, AccountNum)];
    FieldId     returnFieldId   = fieldNum(CustTable, AccountNum);
    Map         queryRanges     = new Map(Types::Integer, Types::String);

    queryRanges.insert(fieldNum(CustTable, AccountNum), '4000');

    LIBSysTableLookup::doLookup(tableNum(CustTable), fieldNums, sortFields, this, returnFieldId, queryRanges);
}

So the only thing you need to do is specify the lookup fields, the return field and the sort fields…

And let’s look at the following example: we need a lookup with a temporary table. Then we can do it like this:

public void lookup()
{
    container   fieldNums       = [fieldNum(TmpIdRef, Name), fieldNum(TmpIdRef, HelpTxt)];
    container   sortFields      = [fieldNum(TmpIdRef, Name)];
    FieldId     returnFieldId   = conPeek(fieldNums, 1);
    TmpIdRef    tmpTable;

    tmpTable = LIBDifferenceAction::BuildActionClassList();

    // Pass null for the query ranges; the last two parameters enable the tmp table
    LIBSysTableLookup::doLookup(tableNum(TmpIdRef), fieldNums, sortFields, this, returnFieldId, null, true, tmpTable);
}

EventHandlers In Ax 2009

It is possible to attach event listeners to objects in X++.
The next code sample shows the usage of the SysEventBroker and SysEventInfo classes to handle events:

static void SysEventTest(Args _args)
{
    SysEventBroker broker = SysEventBroker::construct();
    SysEventInfo   info1  = new SysEventInfo("Info-1"); // catch all
    SysEventInfo   info2  = new SysEventInfo("Info-2"); // catch specific class
    SysEventInfo   info3  = new SysEventInfo("Info-3"); // catch specific type

    // Setup
    broker.addListener(info1, classNum(Object));
    broker.addListener(info2, classNum(RunBase));
    broker.addListener(info3, classNum(Object), "Fire 2"); // catch Fire 2
    broker.addListener(info3, classNum(Object), "Fire 4"); // and Fire 4

    broker.fireEvent(classNum(CustAutoCreate), "Fire 1", "This is my life");
    broker.removeListener(info2, classNum(RunBase));
    broker.fireEvent(classNum(CustAutoCreate), "Fire 2", "this is my time");
    broker.fireEvent(classNum(CustAutoCreate), "Fire 3", "just show me the light");
    broker.fireEvent(classNum(CustAutoCreate), "Fire 4", "and I go there");
}