Copy Data from Azure Analysis Services (AAS) through Microsoft Fabric

Problem Statement

It is possible to copy data from Azure Analysis Services (AAS) via Azure Data Factory (ADF), as described in the blog: Copy Data from SSAS/AAS through Azure Data Factory.

But that approach requires creating a linked server to AAS via a SQL Server instance, and if MFA is enabled on the account used to create the linked server, it cannot be used for automated jobs (the linked server can only be created on an interactive basis).

Is it possible to copy/extract data from Azure Analysis Services in an automated way when MFA is enabled on the Azure AD accounts?

Prerequisites

  1. Microsoft Fabric Dataflow Gen2
  2. AAS Tabular Model

Solution

  1. Log in to https://app.fabric.microsoft.com/home?experience=data-factory and select Dataflow Gen2.
    Login to Microsoft Azure
    Then rename the dataflow.
  2. Search for the Azure Analysis Services connector and select it.
    Data source
  3. Provide the server and database details, and under Advanced options enter the DAX query (the output you need to copy from AAS).
    Connect data source
    Then create a connection to AAS via an organizational account, sign in, and click 'Next'.
    Connection credentials
  4. Apply the transformations you need in Power Query mode.
    In this scenario, we add a new column called AuditDate for auditing purposes.
    Custom column
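For reference, the DAX query used in step 3 of this walkthrough reads as follows; the table, column, and measure names ('Dates', [Fiscal Year], [Sold USD]) come from this example model and should be replaced with your own:

-- Return one row per fiscal year with the total of the [Sold USD] measure
EVALUATE
SUMMARIZECOLUMNS (
    'Dates'[Fiscal Year],
    "Sold USD", [Sold USD]
)

Any DAX query that evaluates to a table works here; its result set becomes the source table of the dataflow.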

Final state of the source transformations.

Final source

The Advanced editor code:

Editor

let
    // Connect to AAS and run the DAX query; replace the placeholders with your server and database names
    Source = AnalysisServices.Database("<<ServerName>>", "<<DatabaseName>>", [Query = "EVALUATE SUMMARIZECOLUMNS('Dates'[Fiscal Year], ""Sold USD"", [Sold USD])"]),
    // Add the AuditDate column and set its type to datetime
    #"Added custom" = Table.TransformColumnTypes(
        Table.AddColumn(Source, "AuditDate", each DateTime.LocalNow()),
        {{"AuditDate", type datetime}}
    )
in
    #"Added custom"

Now select the sink/destination settings.

Data destination

As of this writing, only four sinks are supported.

For our use case, we select Azure SQL Database.

Similar to step 3, create a connection to the Azure SQL Database.

Data destination

You can either create a new table at run time or map to an existing one.

Target

In our use case, we create a new table in the destination named AASTransform.

You can either append or replace the data, depending on the settings.

Append

We proceed with the Append data scenario.

Then click 'Save settings'.

Finally, click 'Publish'.

Publish

The dataflow begins its first execution once it is published.

Database output

To execute it again, click 'Refresh now' on the dataflow.

Dataflow

Since we enabled Append data at the sink, the table now holds two distinct AuditDate values.
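You can verify this on the Azure SQL side with a quick check (a minimal sketch, assuming the table landed in the default dbo schema under the AASTransform name used above):

-- Count the rows loaded per dataflow run;
-- two refreshes with Append should yield two distinct AuditDate groups
SELECT AuditDate, COUNT(*) AS RowsLoaded
FROM dbo.AASTransform
GROUP BY AuditDate
ORDER BY AuditDate;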

Audit date

To schedule the data extraction, you can either set the dataflow refresh frequency via the dataflow settings:

Copy AAS

or schedule it via Fabric Data pipelines.

