Data Migration Framework Hands-on

Architecture of the DMF

The Data Migration Framework uses a three-step process to import data into Dynamics AX.

  • Source files are imported into staging tables
  • Data is transferred from the staging tables to the target tables
  • Staging tables are cleaned up

The import of source files is handled by SQL Server using SSIS (SQL Server Integration Services) packages, stored as .dtsx files. SSIS packages running on SQL Server can import the source data into the Dynamics AX database much faster than any other means.

Once imported into AX, data can be transferred from the staging tables to the target tables. For larger data sets, this can be done in batch jobs.

The different stages allow us to control the data flow and adjust data in the staging tables. Data is imported quickly into Dynamics AX, and from then on we can handle it without any dependencies outside of Dynamics AX.

Source data formats

Formats define the parameters for reading the source files. You can set up the delimiters and code pages used to read the file. For now, only the File type is available, so you can choose between delimited and fixed-length files.

Behind the enum field, there are other types available that suggest the ability to transfer data from one Dynamics AX environment to another. But for now, only the File type is supported.
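To make the Delimited option concrete, a source file for a unit of measure entity might look like the sketch below. This is only an illustration: the delimiter and the column names are hypothetical and have to match the source mapping you define later.

```
UNITID;DESCRIPTION;SYSTEMUNIT
Pcs;Pieces;1
Kg;Kilogram;1
```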

Target entities

Go to Data Migration Framework > Setup > Target entities.

There are some target entities available out of the box. A target entity describes a grouping of tables that defines a logical entity. For example: the Product entity defines a product and therefore contains all of the tables related to a product: InventTable, InventTableModule, …

Target entities can be used to define the following:

  • The staging table for the entity
  • The entity class that is linked to the entity, where you can put additional logic
  • The target entity class (which is a query representing the data tables to target)
  • The mapping between the staging data and the target entity

For each entity, you can view the structure of the entity. This contains all of the physical tables that make up the entity.

The target mapping is generated, but it can always be validated and changed if specific requirements are in place. There is also the possibility to create functions to use in the mapping. Note that this is the mapping between the staging table and the target Dynamics AX UnitOfMeasurement table.
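Such mapping functions are implemented as methods on the entity class. The sketch below follows the common AX 2012 DMF pattern (a generate method decorated with the DMFTargetTransformation attributes); treat the staging table and field names as assumptions for illustration only.

```
// Sketch of a DMF mapping function on an entity class (AX 2012 DMF pattern).
// The attributes register the method as a function usable in the
// staging-to-target mapping; the staging table/field names are assumptions.
[DMFTargetTransformationAttribute(true),
 DMFTargetTransformationDescAttribute("Generate unit symbol"),
 DMFTargetTransFieldListAttribute([fieldStr(DMFUnitOfMeasureEntity, UnitId)])]
public container generateSymbol(boolean _stagingToTarget = true)
{
    container res;

    if (_stagingToTarget)
    {
        // 'entity' is the staging record buffer provided by the framework;
        // return the value(s) to write to the mapped target field(s)
        res = [entity.UnitId];
    }

    return res;
}
```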

Processing groups

Group definition

Processing groups are used to group entities together for import.

Processing Group Entities

For each processing group, you can define the entities involved with this group.

For all of the entities you can define the following:

  • The source data format to use
  • The file path of the file to use
  • Whether records are written using insert or doInsert
  • Whether or not to perform record validation upon insert
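The insert/doInsert choice is worth a word: in X++, insert() runs any logic overridden on the target table's insert method, while doInsert() skips it, which is faster for bulk loads. A minimal sketch (the table is just an illustrative target):

```
// insert() honours table-level insert() overrides (defaulting, extra logic);
// doInsert() writes the record directly and bypasses the overridden method.
static void insertVsDoInsertDemo(Args _args)
{
    UnitOfMeasure unit; // illustrative target table

    unit.Symbol = 'Pcs';

    unit.insert();   // runs the table's overridden insert() logic
    // ...or, for faster bulk inserts without that logic:
    //unit.doInsert();
}
```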

Source file

Concerning the source file, you have two options:

  • When you have a source file
    You can just go ahead and select the source file and the format to be used.
  • When you need to create a source file
    Click Generate source file to launch a wizard that will guide you through the steps to generate a source file. For the above setup, the Generate sample file button will create the following file. (Here the data is already filled in, but the wizard creates the headers.)

Source mapping

When you have a processing group entity selected, click Generate source mapping to create a default mapping, and then click Modify source mapping to view the mapping of the source file to the staging table in Dynamics AX.


Preview source file

Once the source file is selected and the source mapping is done, we can preview the contents of the file by clicking the Preview source file button.

Import staging data

On the processing group form, click the Get staging data button. A dialog will appear showing the ID and description of the processing job that will be created. Fill in a description and click OK.

Following the dialog, the Staging data execution form appears. This form contains details for all of the entities contained in the processing group, such as start time, end time and status.

From here on, you can choose to run the job immediately or to run it on the AOS in batch. So, depending on what you want to do here, click the Run or the Run on AOS button.

After running the job, an infolog is shown informing how many records were inserted in the staging table(s).

To view the staging data, refer to the paragraph explaining the Execution History.

Copy data to target

Back on the processing group form, click the Copy data to target button. This opens a dialog where you can specify a job to process. The selected job is the job that created the staging data. It is this staging data that we are going to transfer from the staging table(s) to the target table(s).

You can also choose to run the data copy for all of the records or for the records specified by the criteria:

  • Records that were already processed with errors
  • Records that were selected by the user

Click the OK button and the Target data execution form appears. This form is similar to the staging data execution form, but here the information is about data being transferred to the target table(s).

Here you can also choose to run it immediately or execute this on the AOS in batch. The result should be an infolog informing about the number of records written to the target table(s).

Execution History

On the processing group form, click the Execution History button to view the history for the selected group. This will open up the history form which shows the following information:

  • The Jobs that have been created for the entities contained in the processing group
  • Staging and target status
  • Staging and target details (Start time, end time, number of records, …)

To view the staging data after importing data into the staging table(s), you can use the View staging data button. It opens a form for the staging table and shows the records. Here you can do the following:

  • Modify staging data
  • Validate the data against the target table
  • Select records to filter on when copying data to the target tables
  • Use the target button to open the destination form (In this case the Unit of measurement form)

Back on the Execution History form, you can use the Log button to view the logging details for all the previous steps.

4 thoughts on “Data Migration Framework Hands-on”

  1. Hello Kenny;
    Thanks for an informative write-up. I have an issue where sometimes after I generate the source mapping successfully, I click the Modify source mapping button, but nothing happens. I get an hourglass for a second or two and then nothing – no window with the maps. Any idea what would prevent the modify source mapping window from opening? Thanks!

  2. Hi Craig,

    I would like to help you out, but I have not experienced that behavior so far. Is there a way I can reproduce this? (A certain case in standard DMF that causes the same issue?)


  3. Hi Kenny

    Thank you very much for responding – I appreciate that!

    It turns out that there were compile issues during the install – so I suspect that was the issue. I’ll know more soon after more testing.

    On a side note, I am just starting to work with the Custom Entity Wizard for two purposes – one purpose is to migrate data into table where we have added VAR fields to a standard table and the second purpose is to migrate data into custom VAR tables. It would be great to hear if you have any thoughts or insights about these situations!

    Thanks again!

  4. Hi,
    I am new to DMF. Can you please help me with the issue below?
    I have a tab-delimited file which has the records of the HcmJob and HcmJobDetail tables.
    Now I need to do a data migration which copies the data to the target tables for both HcmJob and HcmJobDetail.
    How do I do this? Do I have to write a generate method in the DMF class?
    If yes, what would the code be? We also need to populate one more field (Job) in HcmJobDetail which stores the RecId of HcmJob.
    Example: if there are 6 fields in the text file, 2 fields belong to HcmJob, 2 fields belong to HcmJobDetail and the other 2 are not used.
    So, when I insert the record in HcmJob, the RecId of that record should be stored in HcmJobDetail along with the 2 fields present in the text file.
    Can you please let me know the steps to do the above?

