Export, Transform, and Import Data in Dataverse (Power Platform/Dynamics) Using Azure Data Factory

Introduction

Transferring data between Dataverse environments is often necessary for testing, development, or migration. Traditionally, the Configuration Migration Tool has been used for this: data is exported based on a schema and imported into the target environment. Azure Data Factory (ADF) is a more powerful alternative that can also transform the data in transit and load it into many other external data stores. This article walks through exporting data from one Dataverse environment, transforming it, and importing it into another using ADF.

Prerequisites

Before you begin, ensure you have the following:

  • Access to the source and target Dataverse/Power Platform environments.
  • An Azure subscription with permissions to create and manage Azure Data Factory.
  • Basic understanding of Azure Data Factory and Dataverse.

Advantages of Using Azure Data Factory Over Configuration Migration Tool

While the Configuration Migration Tool is useful for moving configuration data between Dataverse environments, Azure Data Factory offers several advantages:

1. Scalability

  • ADF: Can handle large datasets efficiently with built-in support for parallelism and scaling out across multiple nodes.
  • Configuration Migration Tool: Primarily designed for smaller datasets and configuration data; it may not perform well with large volumes of data.

2. Automation

  • ADF: Allows for complete automation of data movement tasks through scheduling, triggers, and integration with other Azure services.
  • Configuration Migration Tool: Typically requires manual intervention for each data transfer operation, making it less suitable for regular, automated tasks.

3. Flexibility

  • ADF: Supports a wide range of data sources and destinations, allowing for complex ETL (Extract, Transform, Load) operations across various data stores.
  • Configuration Migration Tool: Limited to Dataverse and specific data configurations, lacking the flexibility for broader ETL processes.

4. Monitoring and Management

  • ADF: Provides extensive monitoring and logging capabilities, enabling detailed tracking of data transfer processes and easier troubleshooting.
  • Configuration Migration Tool: Limited monitoring capabilities, making it harder to diagnose and resolve issues.

5. Integration

  • ADF: Seamlessly integrates with other Azure services such as Azure Functions, Logic Apps, and Azure SQL Database, providing a comprehensive data integration solution.
  • Configuration Migration Tool: Designed specifically for Dataverse, with limited integration capabilities outside this scope.

6. Cost Efficiency

  • ADF: Pay-as-you-go pricing model based on data volume and activity, potentially offering cost savings for large and regular data transfers.
  • Configuration Migration Tool: No direct costs, but potential inefficiencies in handling large data sets could lead to indirect costs in terms of time and resources.

Step-by-Step Guide


Step 1. Create a Data Factory Instance

  1. Go to the Azure portal.
  2. Navigate to Create a Resource > Integration > Data Factory.
  3. Fill in the required details such as subscription, resource group, and region.
  4. Click Create.
  5. Once created, open the Data Factory resource and select Launch Studio.

Step 2. Create Linked Services

Linked services define the connection information needed for ADF to connect to external resources.

  1. Create Linked Service for Source Dataverse
    • In ADF, go to Manage > Linked Services.
    • Click + New and search for "Dynamics CRM" to select the Microsoft Dataverse connector.
    • Configure the connection with the source Dataverse environment's URL and authentication details.
    • Test the connection and click Create.
  2. Create Linked Service for Target Dataverse
    • Repeat the above steps to create a linked service for the target Dataverse environment.
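Behind the Studio UI, each linked service is stored as JSON (visible via the code view). A rough sketch of what the two Dataverse linked service definitions look like is below, expressed as Python dicts; the property names follow the Dataverse (Common Data Service for Apps) connector documentation, while the names, organization URLs, and Key Vault references are placeholders for illustration:

```python
import copy

# Sketch of a Dataverse linked service definition (service principal auth,
# secret kept in Key Vault). All names and URLs below are placeholders.
source_linked_service = {
    "name": "LS_SourceDataverse",
    "properties": {
        "type": "CommonDataServiceForApps",
        "typeProperties": {
            "deploymentType": "Online",
            "serviceUri": "https://source-org.crm.dynamics.com",
            "authenticationType": "AADServicePrincipal",
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalId": "<app-registration-client-id>",
            "servicePrincipalCredential": {
                # Reference the secret instead of embedding it in the definition.
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_KeyVault",
                    "type": "LinkedServiceReference",
                },
                "secretName": "dataverse-sp-secret",
            },
        },
    },
}

# The target linked service differs only in its name and environment URL.
target_linked_service = copy.deepcopy(source_linked_service)
target_linked_service["name"] = "LS_TargetDataverse"
target_linked_service["properties"]["typeProperties"]["serviceUri"] = (
    "https://target-org.crm.dynamics.com"
)
```

Keeping the client secret in Key Vault (rather than inline) is the usual practice, since linked service definitions end up in source control when the factory is Git-integrated.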

Step 3. Create Datasets

Datasets represent the data structures within the linked services.

  1. Create a Dataset for Source Dataverse
    • Go to Author > Datasets.
    • Click + New and select Microsoft Dataverse.
    • Configure the dataset to point to the linked service of the source Dataverse.
    • Define the table you want to export.
  2. Create a Dataset for Target Dataverse
    • Repeat the above steps to create a dataset for the target Dataverse environment.
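A dataset definition is much smaller: it just binds a table (entity) name to a linked service. A sketch of the source dataset for the Account table, again as a Python dict mirroring the ADF code-view JSON (names are placeholders):

```python
# Sketch of a Dataverse dataset pointing at the Account table through the
# source linked service; dataset and linked service names are placeholders.
account_source_dataset = {
    "name": "DS_SourceAccount",
    "properties": {
        "type": "CommonDataServiceForAppsEntity",
        "linkedServiceName": {
            "referenceName": "LS_SourceDataverse",
            "type": "LinkedServiceReference",
        },
        # Dataverse table (entity) logical name, lowercase.
        "typeProperties": {"entityName": "account"},
    },
}
```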

Step 4. Create Data Flow


Add Source to the Flow

Point the source to the dataset for the source environment (the Account table).


Add Filter Row Transformation

Add a Filter transformation to the flow to retrieve only the records created on or after yesterday. (More complex filter conditions can be applied here as well.)

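In the data flow expression language, the filter condition would read something along the lines of `createdon >= addDays(currentDate(), -1)` (check the mapping data flow function reference and your column's type). The Python below mirrors that logic on illustrative sample rows:

```python
from datetime import date, datetime, timedelta

# The Filter transformation keeps only rows for which its expression is true.
# Here: rows created on or after yesterday.
yesterday = date.today() - timedelta(days=1)

rows = [
    {"name": "Contoso",  "createdon": datetime.now()},                       # created today
    {"name": "Fabrikam", "createdon": datetime.now() - timedelta(days=10)},  # too old
]

recent = [r for r in rows if r["createdon"].date() >= yesterday]
# 'recent' keeps only the Contoso row
```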

Add Select Transformation

Add a Select transformation to keep only the required columns from the table.

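Conceptually, the Select transformation projects each row down to the chosen columns. The column names below are standard Account attributes, used purely for illustration:

```python
# Project each row down to the required columns, as the Select
# transformation does; column names here are illustrative.
required_columns = ["accountid", "name", "telephone1"]

rows = [
    {"accountid": "a1", "name": "Contoso", "telephone1": "555-0100",
     "revenue": 100000, "createdon": "2024-05-01"},
]

projected = [{col: row[col] for col in required_columns} for row in rows]
# each projected row now has exactly the three required keys
```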

Add Destination

Add a sink (destination) to the flow and point its dataset to the target environment.


Additional sinks can be added to copy the data elsewhere as well, for example, an Azure Storage container.


Step 5. Create a Pipeline

Pipelines are data-driven workflows in ADF.

Create a New Pipeline

  • Go to Author > Pipelines.
  • Click + New pipeline and name it appropriately.
  • Add the data flow created above to the pipeline as an Execute Data Flow activity.
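The resulting pipeline definition is a short piece of JSON with a single Execute Data Flow activity. A sketch, with placeholder pipeline, data flow, and compute settings:

```python
# Sketch of the pipeline JSON (as seen in the ADF code view) with one
# Execute Data Flow activity; all names and compute sizes are placeholders.
pipeline = {
    "name": "PL_DataverseToDataverse",
    "properties": {
        "activities": [
            {
                "name": "TransformAccounts",
                "type": "ExecuteDataFlow",
                "typeProperties": {
                    "dataFlow": {
                        "referenceName": "DF_AccountTransform",
                        "type": "DataFlowReference",
                    },
                    # Spark compute that runs the data flow; size to your volume.
                    "compute": {"coreCount": 8, "computeType": "General"},
                },
            }
        ]
    },
}
```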

Step 6. Trigger the Pipeline


Add a Trigger

  • Click Add trigger > New/Edit.
  • Define the trigger to run the pipeline on a schedule or manually.
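For a scheduled run, ADF stores a schedule trigger definition like the following sketch (trigger name, pipeline name, and times are placeholders); this example would run the pipeline once a day:

```python
# Sketch of a daily schedule trigger definition (ADF code-view JSON
# expressed as a Python dict); names and times are placeholders.
daily_trigger = {
    "name": "TR_DailyDataverseCopy",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",   # run once per day
                "interval": 1,
                "startTime": "2024-01-01T02:00:00Z",
                "timeZone": "UTC",
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "PL_DataverseToDataverse",
                    "type": "PipelineReference",
                }
            }
        ],
    },
}
```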

Run the Pipeline

  • Save and publish the pipeline.
  • Trigger the pipeline manually or wait for the scheduled time.

Monitor Pipeline Run

  • Go to Monitor > Pipeline runs to check the status of your pipeline.
  • Review any errors and troubleshoot as necessary.


Don't forget to validate the data in the target environment in Power Platform/Dynamics CRM.

Conclusion

By following these steps, you can successfully export data from one Dataverse environment and import it into another using Azure Data Factory. This method is scalable, flexible, and can be automated, making it ideal for various data transfer scenarios. Azure Data Factory provides a comprehensive solution for data integration needs, leveraging the power of the cloud to facilitate seamless data movement.

For many organizations, the advantages of Azure Data Factory in terms of scalability, automation, and integration outweigh the simplicity of the Configuration Migration Tool, making it a preferred choice for robust data migration strategies.

