Introduction
In this article, you will learn how to build a data pipeline that moves a sample dataset into a Lakehouse. It is a quick demonstration of the pipeline Copy activity and of loading data into a Lakehouse.
Prerequisites
To get started, you need access to Microsoft Fabric and a workspace in which you can create items.
Create a data pipeline
Step 1. Log in to Microsoft Fabric.
Step 2. Select Power BI.
Step 3. Select your Workspace.
Step 4. Select Data Pipeline.
Step 5. Assign a name to the pipeline (Demo in my example) and click Create.
Step 6. Now you should see a screen similar to the one below.
Step 7. Now click "Copy data".
Step 8. You will see some sample data as shown below.
Step 9. Select Diabetes data and click Next.
Step 10. Preview the data and click Next.
Step 11. Select Lakehouse in the Choose destination step.
Step 12. Select the Lakehouse. If you don't have one, select the Create New Lakehouse option.
Step 13. Check the destination details, such as the table name, and click Next.
Step 14. Review the summary, then click Save + Run as shown below.
Step 15. The Copy activity is added to your new data pipeline canvas. All settings, including advanced settings, are available in the tabs below the pipeline canvas when the Copy data activity is selected.
Step 16. Click the activity name to see more details about the run.
Finally, you have created and run a pipeline that copies the sample data into the Lakehouse. You can verify the load from a notebook, as shown in the sketch below.
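To confirm the load, open a notebook attached to the same Lakehouse and query the new table with PySpark. The snippet below is a minimal sketch: it assumes the destination table is named diabetes (substitute whatever table name you chose in Step 13) and relies on the Spark session that Fabric notebooks provide.

```python
# Minimal verification sketch for a Fabric notebook attached to the destination Lakehouse.
# Assumption: the Copy activity wrote to a table named "diabetes"; replace it with the
# table name you chose in Step 13.
from pyspark.sql import SparkSession

# A SparkSession named `spark` already exists in a Fabric notebook;
# getOrCreate() simply reuses it.
spark = SparkSession.builder.getOrCreate()

df = spark.sql("SELECT * FROM diabetes LIMIT 10")
df.show()                                                # preview a few rows
print("Rows loaded:", spark.table("diabetes").count())  # confirm the row count
```

If the table appears with the expected rows, the Copy activity completed successfully.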
Conclusion
You can also schedule the pipeline to run at a specific frequency as required; below is an example of scheduling it to run every 15 minutes. Depending on the size of the data, you can create this pipeline in under 10 minutes.
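Scheduling is configured from the pipeline's Schedule settings in the UI. If you prefer to trigger runs programmatically instead, the Fabric REST API exposes an on-demand job endpoint. The sketch below is an illustration rather than the method used in this article, and it rests on assumptions: the endpoint path, the jobType value, and the placeholder IDs and token should all be verified against the current Fabric REST API reference before use.

```python
# Hypothetical sketch: requesting an on-demand run of the pipeline through the
# Fabric REST API. The endpoint shape and jobType value are assumptions; check
# the Fabric REST API documentation before relying on them.
import requests

WORKSPACE_ID = "<your-workspace-id>"      # placeholder
PIPELINE_ID = "<your-pipeline-item-id>"   # placeholder
ACCESS_TOKEN = "<azure-ad-access-token>"  # placeholder: obtain via Azure AD (e.g. MSAL)

url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline"
)
response = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
response.raise_for_status()
print("Pipeline run requested; HTTP status:", response.status_code)
```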