Create Your First Pipeline to Copy Data to Lakehouse

Introduction

In this article, you will learn how to build a data pipeline that copies a sample dataset to a Lakehouse. It is a quick demonstration of how to use the pipeline Copy activity and how to load data into a Lakehouse.

Prerequisites

To get started, you need access to Microsoft Fabric and a workspace in which you can create a data pipeline.

Create a data pipeline

Step 1. Log in to Microsoft Fabric.

Microsoft Fabric

Step 2. Select Power BI.

Power BI

Step 3. Select your Workspace.

Workspace

Step 4. Select Data pipeline.

Data pipeline

Step 5. Assign a name to the pipeline (Demo in my example) and click Create.

Demo

Step 6. Now you should see a screen similar to the one below.

Screen

Step 7. Now click "Copy data".

Copy data

Step 8. You will see some sample data as shown below.

Sample

Step 9. Select the Diabetes sample data and click Next.

Data source

Step 10. Preview the data and click Next.

Next

Step 11. Select Lakehouse in the Choose destination step.

Select destination

Step 12. Select your Lakehouse. If you don't have one, select the Create new Lakehouse option.

Lakehouse

Step 13. Check the details, such as the table name, and click Next.

Details

Step 14. Review the settings, then click Save + Run as shown below.

Copy data

Step 15. The Copy activity is added to your new data pipeline canvas. All settings for the activity, including advanced settings, are available in the tabs below the pipeline canvas when the Copy activity is selected.

Selected

Step 16. Click the activity name to view more details about the run.

Activity

You have now created a data pipeline that copies the sample data to the Lakehouse and executed it successfully.
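
If you want to confirm that the data landed in the Lakehouse, below is a minimal sketch of a verification query you could run in a Fabric notebook. The table name "diabetes" is an assumption; substitute the table name you confirmed in Step 13, and make sure the notebook's default Lakehouse is the one the pipeline loaded.

```python
from pyspark.sql import SparkSession

# A Fabric notebook already provides a SparkSession named `spark`;
# getOrCreate() simply reuses it (or builds one when run elsewhere).
spark = SparkSession.builder.getOrCreate()

# "diabetes" is an assumed table name -- replace it with the table name
# you chose in Step 13.
df = spark.read.table("diabetes")

print(f"Rows copied: {df.count()}")
df.show(5)  # preview the first few rows
```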

Conclusion

You can also schedule the pipeline to run at a specific frequency as required; for example, you could schedule it to run every 15 minutes. Depending on the size of the data, you can build and run this pipeline in under 10 minutes.
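
If you ever want to trigger runs from code in addition to the built-in schedule, the sketch below shows one possible approach using the Fabric REST API's on-demand job endpoint. The URL shape, the jobType value, and all placeholder IDs are assumptions; verify them against the official Fabric REST API documentation before relying on this.

```python
import requests

# Placeholders -- substitute your own values. The workspace and pipeline IDs
# can be read from their URLs in the Fabric portal, and the token must be a
# Microsoft Entra ID access token with permission to call the Fabric API.
WORKSPACE_ID = "<your-workspace-id>"
PIPELINE_ID = "<your-pipeline-id>"
TOKEN = "<your-access-token>"

# Assumed on-demand job endpoint for a pipeline item.
url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline"
)

response = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
response.raise_for_status()
print("Pipeline run requested, status:", response.status_code)
```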