I often see code that still uses the WinAPI::getTickCount method to measure an interval of time. And though there is nothing wrong with that, I wanted to show an alternative: the Stopwatch class that has been available in .NET for quite some time.
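For reference, the getTickCount pattern typically looks something like this (a minimal sketch; the job name is made up):

```
static void OldTickCountTiming(Args _args)
{
    int startTick;
    int elapsed;

    // Take a tick count before the work starts
    startTick = WinAPI::getTickCount();

    // ... the code to measure ...

    // The interval is the difference between two tick counts
    elapsed = WinAPI::getTickCount() - startTick;
    info(strFmt("Time taken: %1 ms", elapsed));
}
```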
The sample below shows the usage of the stopwatch while I was testing something out by looping over all of the tables in the system:

static void FetchTableNames_ModelElementTables(Args _args)
{
    SysModelElement element;
    SysModelElementType elementType;
    System.Collections.ArrayList tableNames = new System.Collections.ArrayList();
    System.Diagnostics.Stopwatch stopWatch = new System.Diagnostics.Stopwatch();
    int64 elapsed;

    // Start the stopwatch before the work begins
    stopWatch.Start();

    // The SysModelElementType table contains the element types and we need the RecId for the next selection
    select firstonly RecId
    from elementType
    where elementType.Name == 'Table';

    // With the RecId of the table element type, select all of the elements with that type (hence, select all of the tables)
    while select Name
        from element
        where element.ElementType == elementType.RecId
    {
        tableNames.Add(element.Name);
    }

    // Stop the stopwatch and get the time it took
    stopWatch.Stop();
    elapsed = stopWatch.get_ElapsedMilliseconds();

    info(strFmt("Time taken: %1 ms", elapsed));
}

Microsoft Dynamics AX 2012 Services ready!

Hi to all of you!

It has been a while since I have posted some useful content here, mainly because I was focusing on getting the content of the book finished.

And now we can finally say that it is finished, reviewed and approved! So as we speak the presses are rolling to get it printed. All of you who have pre-ordered a copy, thanks for your patience, it will be rewarded 🙂

Book cover

Dynamics Ax 2012 R2 VPC: new features?

Hi everyone, This is actually more a question than a post 🙂

Last week, I downloaded the R2 virtual machine and fired it up to venture into the new capabilities of Dynamics AX 2012 R2. (And yes, shamefully, I missed the Technical Conference!)

Once the thing booted up, I could immediately see that the model store was indeed separated from the data. I like!
But then I tried to look for a couple of the other new features (like LINQ, for example) but could not find any of them… ?!

Are most of the features already present in the virtual machine, or have they been left out for now? Any comments would be greatly appreciated!

Introducing Microsoft Dynamics AX 2012 Services!

Today I can finally present to you my first published book: Microsoft Dynamics AX 2012 Services!

As you might have guessed from the title of the book, it covers everything there is to know about services within Dynamics AX 2012. This includes topics like the service architecture, WCF, document services, custom services, consuming web services, the SysOperation framework and much more. You can visit the Packt Publishing website for a more detailed description of the contents.

I have authored this book together with Klaas Deforche, a colleague and friend of mine. So my first thanks go out to him. Also, I want to thank the reviewers for providing their insights.

We are working hard to finalize the book, but it is already available for pre-order. For those who are planning on getting a copy, pre-ordering might be a good plan since there is a nice discount for pre-orders.

And as we are still finalizing the book, there is the opportunity for all of you to post ideas or things you would like to see covered in the book. We might actually take them into account if possible.

Preorder here!


Data Migration Framework Hands-on

Architecture of the DMF

The Data Migration Framework uses a three-step process to import data into Dynamics AX.

  • Source files are imported into staging tables
  • Data is transferred from the staging tables to the target tables
  • Staging table cleanup

The import of source files is handled by SQL Server using SSIS (SQL Server Integration Services) packages (.dtsx files). Packages running on SQL Server can import the source data into the Dynamics AX database much faster than any other means.

Once imported into AX, data can be transferred from the staging tables to the target tables. This can be done in batch jobs to handle larger data sets.

The different stages allow us to control the data flow and adjust data in the staging tables. Data is imported quickly into Dynamics AX, and from then on we can handle the data without any dependencies outside Dynamics AX.

Source data formats

Formats define the parameters for reading the source files. You can set up the delimiters and code pages used to read the file. For now, only the File type is available, so you can choose between Delimited and Fixed Length files.

Behind the enum field, there are other types available that suggest the ability to transfer data from one Dynamics AX environment to another. But for now, only the File type is supported.

Target entities

Go to Data Migration Framework > Setup > Target entities.

There are some target entities available out of the box. Target entities describe a grouping of tables that defines a logical entity. For example: the product entity defines a product and therefore contains all of the tables related to a product: InventTable, InventTableModule, …

Target entities can be used to define the following:

  • The staging table for the entity
  • The entity class that is linked to the entity, where you can put additional logic
  • The target entity class (which is a query representing the data tables to target)
  • The mapping between the staging data and the target entity

For each entity, you can view the structure of the entity. This contains all of the physical tables that make up the entity.

The target mapping is generated, but it can always be validated and changed if specific requirements are in place. There is also the possibility to create functions to use in the mapping. Note that this is the mapping between the staging table and the target Dynamics Ax UnitOfMeasurement table.

Processing groups

Group definition

Processing groups are used to group entities together for import.

Processing Group Entities

For each processing group, you can define the entities involved with this group.

For all of the entities you can define the following:

  • The source data format to use
  • The file path of the file to use
  • Whether to use insert or doInsert when writing the records
  • Whether or not to perform record validation upon insert
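The difference between insert and doInsert can be sketched in X++ as follows (a minimal illustration, not actual DMF code; CustTable is used purely as an example table):

```
void insertStagingRecord(CustTable _custTable, boolean _useDoInsert)
{
    if (_useDoInsert)
    {
        // doInsert() writes the record directly to the database,
        // bypassing any overridden insert() method on the table
        _custTable.doInsert();
    }
    else
    {
        // insert() runs the table's insert() method, including any
        // custom logic that has been added there
        _custTable.insert();
    }
}
```

In short, doInsert is faster but skips custom table logic, which is why the framework lets you choose per entity.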

Source file

Concerning the source file, you have two options:

  • When you have a source file
    Then you can just go ahead and select the source file and the format to be used
  • When you need to create a source file
    Here you need to click Generate source file to launch a wizard that will guide you through the steps to generate a source file. For the above setup, the Generate sample file button will create the following file. (Here the data is already filled in, but the wizard creates the headers.)

Source mapping

When you have a processing group entity selected, click Generate source mapping to create a default mapping, and then click Modify source mapping to view the mapping of the source file to the staging table in Dynamics AX.


Preview source file

Once the source file is selected and the source mapping is done, we can preview the contents of the file by clicking the Preview source file button.

Import staging data

On the processing group form, click the Get staging data button. A dialog will appear showing the ID and description of the processing job that will be created. Fill in a description and click OK.

Following the dialog, the Staging data execution form appears. This form contains details for all of the entities contained in the processing group, such as: start time, end time and status.

From here on, you can choose to run the job immediately or to run it on the AOS in batch. So, depending on what you want to do here, click the Run or the Run on AOS button.

After running the job, an infolog is shown informing you how many records were inserted into the staging table(s).

To view the staging data, refer to the paragraph explaining the Execution History.

Copy data to target

Back on the processing group form, click the Copy data to target button. This opens up a dialog where you can specify a job to process. The selected job is the job that created the staging data. It is this staging data that we are going to transfer from the staging table(s) to the target table(s).

You can also choose to run the data copy for all of the records or for the records specified by the criteria:

  • Records that were already processed with errors
  • Records that were selected by the user

Click the OK button and the Target data execution form appears. This is a form similar to the staging data execution form, but here the information is about data being transferred to the target table(s).

Here you can also choose to run it immediately or execute this on the AOS in batch. The result should be an infolog informing about the number of records written to the target table(s).

Execution History

On the processing group form, click the Execution History button to view the history for the selected group. This will open up the history form which shows the following information:

  • The Jobs that have been created for the entities contained in the processing group
  • Staging and target status
  • Staging and target details (Start time, end time, number of records, …)

To view the staging data after importing data into the staging table(s), you can use the View staging data button. It opens a form for the staging table and shows the records. Here you can do the following:

  • Modify staging data
  • Validate the data against the target table
  • Select records to filter on when copying data to the target tables
  • Use the target button to open the destination form (In this case the Unit of measurement form)

Back on the Execution History form, you can use the Log button to view the logging details for all the previous steps.

Ax 2012 and ETW (Event Tracing for Windows)

Now let’s get down to business quickly and start off with a question: “How often did you want to reproduce / debug a certain process without actually knowing where the source of the problem may reside?”

Well, for most of us the answer to that question will be: “Quite a lot”, and this would be followed immediately by the question: “Did you then create data log tables to store values, stack traces, information messages, …?” And again, the answer will often be: “Yes”.

Now this will probably help you out : ETW!

I will not go into the full details of Event Tracing for Windows as this would definitely take us too far, but here’s a link where you can do some additional reading :

What we will look at in this post is how we can take advantage of ETW to do some tracing within Dynamics Ax 2012. Specifically, we are going to make a custom piece of code output some informational messages that can be caught by a data collector set using Perfmon and viewed in the Event Viewer.

The following question may arise: “Why would we want ETW to handle the tracing and not stay in our habitat of X++, storing data in tables?” Well, the answer to this is:

  • ETW causes far less overhead than normal logging would, both in execution time (CPU) and disk space.
  • If your code is ETW ready, you can do logging in production environments by starting / stopping data collector sets.
  • If your code is ETW ready, logging does not need to be built when needed but is immediately available.
  • The log files can be used in Event Viewer and also SCOM, so it helps system administrators know what is going on in the black box that is Dynamics Ax.

So let us jump in and create the data collector set.

First start up by opening Perfmon and navigating to the user defined data collector sets.

Right click and create a new data collector set.



After creating the data collector set, right click it and you can start / stop the collection of ETW events.

But now that we have the collection part, we also need to make custom code trigger some events from within Dynamics Ax. And that is where the xClassTrace class comes into play. The xClassTrace class is used to take advantage of ETW: it can start / stop traces to files, log component messages in ETW to be caught by data collector sets, …

So let us take a look at some sample code that loops over all customers and logs a component message in ETW to be caught. (Please note the isTracingEnabled check to make sure string formatting will only be done when needed, keeping the performance overhead to a minimum.)

static void KeSae_ETW_LogComponentMessageTest(Args _args)
{
    CustTable theCustomer;

    xClassTrace::logComponentMessage('AxTracing', "Starting ETW tracing");

    while select theCustomer
    {
        // Only format the string when tracing is actually enabled
        if (xClassTrace::isTracingEnabled())
        {
            xClassTrace::logComponentMessage('AxTracing', strFmt("Processing customer %1", theCustomer.AccountNum));
        }
    }

    xClassTrace::logComponentMessage('AxTracing', "Stopping ETW tracing");
}

Now that our code is ready to log some messages, start the data collector set in the Perfmon tool and run the sample code. After running the code, go ahead and stop the data collector set. The result will be a generated log file that can be opened in the Event Viewer, as seen below, where we can actually see the messages logged from within Dynamics Ax.

So this is a very simple example, but this logging method is the way to go when you want efficient, well-performing logging in your system! This way you can minimize the logging overhead.

Business Operation Framework and multi-threading

Some of you will be familiar enough with Ax 2009 and therefore know how to create multiple threads when running batch jobs.
For those of you who aren’t : no worries! Since most of it will be done in the same way as before, you will be up to speed in no time.

The main difference in Ax 2012 is the Business Operation Framework (BOF) that renders the RunBaseBatch framework kind of obsolete. (Read: it’s MS best practice to use BOF instead of RunBaseBatch.) The BOF lets you create services and run these services in CIL. I will spare you the full details about CIL as it is out of the scope of this article. You can find all the details about creating these services in some very nice posts by a colleague of mine.

Now that you have seen the basics, let’s get to the point of this article. Today we were wondering if the new BOF would also be able to handle multi-threaded batch processing, and especially how it would be accomplished. Well, here’s how…

In this example I will use rather useless functionality, but it’s done like this to keep things simple. I will have a set with some names in it, and instead of running one service to display them all we will create two services:

  • The KeSaeBatchService will be used to run the batch job and divide the work into smaller threads
  • The KeSaeRunTaskService will act as one of those threads running in batch

Creating the run task service

The first thing to do is create the KeSaeRunTaskService and a data contract for it that contains the name that will be passed as a parameter. So start by creating the data contract.

[DataContractAttribute]
public class KeSaeRunTaskDataContract
{
    Name mName;
}

[DataMemberAttribute]
public Name parmName(Name _name = mName)
{
    mName = _name;
    return mName;
}

Now that we have the contract, let’s create the service class. The service class just contains one method ‘process‘ that will be passed a data contract.

class KeSaeRunTaskService
{
}

public void process(KeSaeRunTaskDataContract _theContract)
{
    // Inside the runtime task we just print the name that was passed
    info(strFmt('%1', _theContract.parmName()));
}

Now create a service within the AOT and add the operation as seen below.

Creating the batch operation service

Now let’s do the same thing all over again, but for the service that will be submitted to the batch framework.
The only additional thing here is to create a menu item for that service to be able to run it.

And last but not least, we need to put some code in the batch service to create smaller runtime tasks when processing in batch, so let’s take a look at the process method.

public void process()
{
    Set                           theNames = new Set(Types::String);
    SetEnumerator                 theEnum;
    Name                          theName;
    SysOperationServiceController mController;
    KeSaeRunTaskDataContract      mContract;
    BatchHeader                   mBatchHeader;

    // Add some names to the set to process (sample values for illustration)
    theNames.add('Name 1');
    theNames.add('Name 2');
    theNames.add('Name 3');

    // Create the enumerator
    theEnum = theNames.getEnumerator();

    // Loop all the names
    while (theEnum.moveNext())
    {
        // Get the next name
        theName = theEnum.current();

        // Create a service controller to run the task for processing one name
        mController = new SysOperationServiceController(classStr(KeSaeRunTaskService), methodStr(KeSaeRunTaskService, process));

        // Fetch the data contract from within the controller
        mContract = mController.getDataContractObject('_theContract');

        // Put the current name in the controller's data contract
        mContract.parmName(theName);

        // Check if we are batch processing or not
        if (this.isExecutingInBatch())
        {
            mBatchHeader = BatchHeader::getCurrentBatchHeader();

            // Create a runtime task within the current batch job
            mBatchHeader.addRuntimeTask(mController, this.getCurrentBatchTask().RecId);
        }
        else
        {
            // Just run it immediately
            mController.run();
        }
    }

    // If we're processing in batch, then save the batch header
    if (mBatchHeader)
    {
        mBatchHeader.save();
    }
}
This piece of code differs from Ax 2009 by constructing a SysOperationServiceController instead of a RunBaseBatch class to add as a runtime task. This works because SysOperationServiceController extends SysOperationController, which in its turn implements Batchable.

class SysOperationServiceController extends SysOperationController
public abstract class SysOperationController extends Object implements Batchable

That’s about it! Do not forget to compile CIL! Then you should see this when your service is processed in batch.

And when clicking the parameters button on one of the threads, you can see the name that was passed in the data contract to the thread.

All this can be found in an XPO file available here.

SysOperationFramework : Field display order

Since the arrival of Dynamics Ax 2012, the traditional RunBaseBatch framework is becoming obsolete. It has been replaced by the SysOperationFramework. This framework brings a few nice features that were missing in the RunBaseBatch framework. One of them is the MVC pattern to separate concerns.

The basics

I would like to start with a few links concerning how to create SysOperationFramework services. A colleague of mine, Klaas Deforche, has created some posts that clearly explain how it works.

All of the posts are linked through this overview post :

Field display order

Well now, let’s come to the question of this post: how can you determine the sequence of dialog fields on the dialogs created by the SysOperationFramework? The answer lies in the attributes available in Dynamics Ax 2012. You have a contract with several data members marked with the [DataMemberAttribute] attribute. To determine the sequence, you can add the [SysOperationDisplayOrderAttribute] attribute (which takes the desired order as a parameter).

For an example of this, you can check the AssetBalanceReportColumnsContract class, method parmAssetBookId.

[DataMemberAttribute,
 SysOperationDisplayOrderAttribute('1'),
 SysOperationGroupMemberAttribute('Book')]
public AssetBookMergeId parmAssetBookId(AssetBookMergeId _assetBookId = assetBookId)
{
    assetBookId = _assetBookId;
    return assetBookId;
}

Groups and group sequence

In the previous sample of code you can also see the SysOperationGroupMemberAttribute attribute. This determines which fields belong to a certain group on a dialog. You can also use a custom sequence on groups by using the SysOperationGroupAttribute attribute, as seen in the classDeclaration:

[DataContractAttribute,
 SysOperationGroupAttribute('Book', "@SYS95794", '1'),
 SysOperationGroupAttribute('Period', "@SYS40", '2')]
public class AssetBalanceReportColumnsContract implements SysOperationValidatable
{
    boolean visibleFR;
    AssetBookMergeId assetBookId;
    ToDate closingDatePriorYear;
    ToDate closingDateThisYear;
}

So there you have it. If you want to rearrange things on the dialog, you can use the above method.


Propagate infolog messages to the windows event log

The Windows Event Viewer can be a nice tool to check for messages dispatched by the system. You can save the logs in there and reopen them, and different kinds of information are available, so you can actually trace lots of things in there. But wouldn’t it be nice to also be able to log the messages thrown by Ax 2012 in the Windows event log?

That way you do not lose user messages and they are nicely logged into the event viewer. It can also help to log messages received on a client that you cannot seem to reproduce, …

Well, it is possible and here is how to do it in a couple of steps:

  • Add a windows event log and source to put our specific infolog messages in
  • Edit the Ax32 config file to add an event log listener to the configuration

Create event log and source

So first things first, let’s create a Windows event log by using the following PowerShell command:

new-eventlog -logname "RealDolmen Ax Solutions" -source "Ax 2012 Infolog"

The result should be like in the figure below

Configure the listener

To add a listener, first open the Ax32.exe.config file located in the client bin directory. You should see a configuration similar to this:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0.30319" />
    <requiredRuntime version="v4.0.30319" safemode="true"/>
  </startup>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <probing privatePath="EditorComponents"/>
    </assemblyBinding>
  </runtime>
</configuration>

Modify the configuration so that it looks like this (it is absolutely important to keep the source name! The initializeData must be filled with the source you created in the event log):

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <system.diagnostics>
    <trace autoflush="true"/>
    <sources>
      <source name="Microsoft.Dynamics.Kernel.Client.DiagnosticLog-Infolog" switchValue="Information">
        <listeners>
          <!-- The initializeData contains the source that was linked to the created event log -->
          <add name="EventLog" type="System.Diagnostics.EventLogTraceListener" initializeData="Ax 2012 Infolog"/>
        </listeners>
      </source>
    </sources>
  </system.diagnostics>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0.30319" />
    <requiredRuntime version="v4.0.30319" safemode="true"/>
  </startup>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <probing privatePath="EditorComponents"/>
    </assemblyBinding>
  </runtime>
</configuration>

Now we are all set up and when firing up the client, any messages to the infolog should be redirected to the event log. So let’s send some error lines to the infolog.

Now check if the same messages appear in the event log. Normally this is what it should look like:

So there you have it. The messages are nicely logged in the Event Viewer. As a last remark, you can also adjust the logging level by modifying the switchValue of the source. Off will not log anything at all; Verbose will fill your event log with everything.
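For example, to capture everything this source emits, the switchValue on that same source element can be raised to Verbose (a fragment of the config shown above; the listener inside stays unchanged):

```xml
<!-- Verbose logs everything for this source; Off disables it entirely -->
<source name="Microsoft.Dynamics.Kernel.Client.DiagnosticLog-Infolog" switchValue="Verbose">
  ...
</source>
```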

AX2012 Editor Extensions

Today I was reading a very interesting post about possible extensions to the editor in AX 2012 and I would very much like to share it.

Basically it explains that the editor is a hosted Visual Studio editor (that was already clear) and, knowing this, it is also possible to use the extensions that are available for it inside Ax.
The post I read was dealing with the brace matching extension that is available to do automatic formatting of braces.

You can read the full post here :