Problem Statement
It is possible to copy / extract data from a Power BI dataset in an automated way, as described in the blog "Overcoming Data Size / Row Limitations of Power BI REST API for Automated Data Extraction from Dataset", by leveraging the Power BI REST API: Datasets – Execute Queries.
But unlike a Power BI dataset, a Power BI dataflow doesn't have any native API for data extraction. So in order to extract data from a dataflow, we can leverage Microsoft Fabric Dataflow Gen 2, as shown in the solution below.
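For context, the dataset approach referenced above can be sketched as follows. This is a minimal illustration of calling the Datasets – Execute Queries REST endpoint with Python's standard library; the dataset ID, DAX query, and access token are placeholders, and acquiring the Azure AD token is out of scope here.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_execute_queries_request(dataset_id: str, dax_query: str, token: str):
    """Build the HTTP request for the Datasets - Execute Queries REST API."""
    url = f"{API_BASE}/datasets/{dataset_id}/executeQueries"
    body = {"queries": [{"query": dax_query}]}
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical IDs and token, for illustration only:
req = build_execute_queries_request(
    "11111111-2222-3333-4444-555555555555",
    "EVALUATE TOPN(10, DateDim)",
    "<aad-access-token>",
)
# urllib.request.urlopen(req)  # would execute the DAX query against the service
```

This approach works for datasets only, which is why the dataflow scenario below needs a different mechanism.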
Prerequisites
- MSFT Fabric Dataflow Gen 2
- Power BI Dataflow
Solution
1. Log in to https://app.fabric.microsoft.com/home?experience=data-factory, select Dataflow Gen 2,
and rename the dataflow.
2. Click 'Get Data' >> 'More' and select 'Dataflows'.
3. Create a new connection (in case there isn't an existing one) using your organizational account and click 'Next'.
4. The dataflows to which the organizational account has access are listed under the Workspaces section.
Filter for your dataflow name and select the table whose data you need to extract.
5. Apply the necessary transformations in the Power Query editor.
In this scenario, we add a new column called AuditDate for auditing purposes.
The final state of the source transformations is captured by the Advanced Editor code below.
let
    Source = PowerPlatform.Dataflows([]),
    #"Navigation 1" = Source{[Id = "Workspaces"]}[Data],
    #"Navigation 2" = #"Navigation 1"{[workspaceId = "0bca1820-43e9-49dc-a3e1-82e505dbdae7"]}[Data],
    #"Navigation 3" = #"Navigation 2"{[dataflowId = "bfd00ae0-cf60-4771-a041-289142951ea5"]}[Data],
    #"Navigation 4" = #"Navigation 3"{[entity = "DateDim", version = ""]}[Data],
    #"Added custom" = Table.TransformColumnTypes(
        Table.AddColumn(#"Navigation 4", "AuditDate", each DateTime.LocalNow()),
        {{"AuditDate", type date}}
    )
in
    #"Added custom"
6. Now select the sink / destination settings.
As of today, only four sinks are supported.
For our use case, we select Azure SQL Database.
7. Similar to step 3, create a connection to the Azure SQL Database.
8. You can either create a new table at run time or map to an existing one.
In our use case, we create a new table named DateDim in the destination.
9. You can either append the data or replace it, based on the settings.
We proceed with the Append data scenario
and click 'Save settings'.
10. Finally, click 'Publish'.
11. The dataflow begins its first execution once it is published.
Database output
12. To execute again, click the 'Refresh now' option on the dataflow.
As we have enabled Append data at the sink, the destination table now holds two distinct AuditDate values, one per run.
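To see why append mode yields one AuditDate per run, here is a minimal local simulation using Python's built-in sqlite3 in place of Azure SQL. The table name DateDim matches the destination table above; the column set and dates are illustrative.

```python
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DateDim (CalendarDate TEXT, AuditDate TEXT)")

def run_dataflow(conn, audit_date):
    # Each refresh appends the full source extract, stamped with a new AuditDate.
    rows = [(str(date(2024, 1, 1) + timedelta(days=i)), str(audit_date))
            for i in range(3)]
    conn.executemany("INSERT INTO DateDim VALUES (?, ?)", rows)

run_dataflow(conn, date(2024, 5, 1))   # first execution after publish
run_dataflow(conn, date(2024, 5, 2))   # 'Refresh now' re-execution

distinct = conn.execute(
    "SELECT COUNT(DISTINCT AuditDate) FROM DateDim").fetchone()[0]
print(distinct)  # 2 distinct AuditDates, one per run
```

With Replace data selected instead, each run would overwrite the previous extract and only the latest AuditDate would remain.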
13. To schedule the data extraction, you can either set the dataflow refresh frequency via the dataflow settings
or schedule it via Fabric Data pipelines.
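Beyond those two scheduling options, a refresh can also be triggered programmatically. The sketch below builds a request for the Fabric REST API's on-demand item job endpoint; the item ID and token are placeholders, and the jobType value for a Dataflow Gen 2 refresh is an assumption that should be verified against the current Fabric documentation.

```python
import json
import urllib.request

def build_refresh_request(workspace_id: str, item_id: str, token: str):
    """Build an on-demand refresh request via the Fabric Job Scheduler API."""
    # jobType for a Dataflow Gen 2 refresh is assumed here; confirm in the docs.
    url = (f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
           f"/items/{item_id}/jobs/instances?jobType=Refresh")
    return urllib.request.Request(
        url,
        data=json.dumps({}).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_refresh_request(
    "0bca1820-43e9-49dc-a3e1-82e505dbdae7",   # workspace ID from the M code above
    "<dataflow-item-id>",                     # placeholder item ID
    "<aad-access-token>",
)
# urllib.request.urlopen(req)  # would start the refresh job
```

A pipeline or external scheduler could invoke this to chain the extraction with downstream steps.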