Ben Watt

Using Power Automate to run DAX queries against your Power BI dataset

This week the Power BI team announced an awesome new feature. It's the ability to run DAX queries against your Power BI datasets via a REST API. Wow, wow, wow!


Given that coding against APIs might not be something everyone is able to do, I thought I'd share how you can use Power Automate to run a DAX query, extract the data and save it to a file.


Below is a high-level summary of the Flow I built. You can tweak this example in a thousand different ways to suit your own scenarios, using connectors like Excel, Teams, OneDrive, Azure Storage, Dropbox, Jira and hundreds of others.

  1. Obtain a Token to authenticate to the Power BI REST APIs

  2. Call the new Power BI API, passing a DAX query

  3. Parse the response and convert to a CSV table

  4. Save the result as a CSV file in SharePoint

A Power Automate, API, Power BI, SharePoint disco


Getting an Access Token


From the screenshots below, you'll see I was a very good citizen and used Azure Key Vault to store my credentials, so no sensitive data is stored in plain text in the flow.

The first few steps


The client credentials are obtained by creating an Azure App Registration. You can see how to create one of these in another blog I wrote here.


The Microsoft authentication endpoint gives back a token which I use later on. You can see in the Body part of the screenshot below that I used "client_credentials", which means I did not need to tie the authentication to a user's account, as I was using the Service Principal option. This is much better for automation activities.


Just to add, Key Vault and HTTP are both premium actions in Power Automate, so you'll need the relevant licence to build this.

The HTTP action to grab a token
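
For reference, the request that HTTP action sends looks roughly like this (a sketch assuming the v2.0 token endpoint, with the tenant ID, client ID and secret pulled from Key Vault):

POST https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials
&client_id={client-id}
&client_secret={client-secret}
&scope=https://analysis.windows.net/powerbi/api/.default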


The next steps parse the response and save the token type and token to a new variable to be used later. I also create a local variable which contains the DAX query to pass to the API.

Parsing the results & setting a variable with the token data
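
The variable itself just concatenates the token type and the access token so it can be dropped straight into an Authorization header later. Assuming your Parse JSON action is called "Parse Token" (name it whatever you like), the expression would be along these lines:

concat(body('Parse_Token')?['token_type'], ' ', body('Parse_Token')?['access_token'])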


The API takes a DAX query which is executed against the dataset. I would suggest testing your query in a tool like DAX Studio to ensure it returns the data you want. If you haven't written DAX queries before, I suggest creating a table in Power BI Desktop that matches the output you want and using the built-in Performance Analyzer, where you can then simply copy the query.

Get your DAX query from Performance Analyzer
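
A query copied from Performance Analyzer can be quite verbose; a hand-written equivalent can be as simple as the sketch below, where the table, column and measure names are just examples:

EVALUATE
SUMMARIZECOLUMNS(
    'Date'[Year],
    "Total Sales", SUM(Sales[SalesAmount])
)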


Call the new API


The prep work is done; now we just pass it to the new API. Not much else to it, really!

Call the new API, pass some DAX, get some data!
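
If you can't make out the screenshot, the request the HTTP action sends is roughly this (the group and dataset IDs are placeholders, the Authorization header uses the token variable from earlier, and the query is just an example):

POST https://api.powerbi.com/v1.0/myorg/groups/{group-id}/datasets/{dataset-id}/executeQueries
Authorization: Bearer {access-token}
Content-Type: application/json

{
  "queries": [
    { "query": "EVALUATE SUMMARIZECOLUMNS('Date'[Year], \"Total Sales\", SUM(Sales[SalesAmount]))" }
  ],
  "serializerSettings": { "includeNulls": true }
}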


Parse the API response & convert to CSV


The API passes its result set back in JSON format, which I parse and convert to a flat table. The response body is made up of nested arrays, which I assume is designed to support a future capability of passing multiple DAX queries.


Example result set


You get an array of rows, within an array of tables, within an array of results. Currently, the actual result set is contained in the first item in the list of results and the first item in the list of tables. I used a Flow expression to grab the rows from that first results item and first table, which looks like this:


first(first(body('Parse_DAX_Response')?['results'])['tables'])['rows']
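
For context, the response body has roughly this shape, which is what the expression above is digging into (the column and measure names depend on your query):

{
  "results": [
    {
      "tables": [
        {
          "rows": [
            { "Date[Year]": 2021, "[Total Sales]": 12345.67 }
          ]
        }
      ]
    }
  ]
}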

The next step is a simple Create CSV table action, which takes this array and converts it to a CSV table.


Save the results to a file in SharePoint


Lastly, I take this and save it to a time-stamped file in SharePoint, using this expression for the timestamp:


formatDateTime(utcNow(), 'yyyyMMdd_HHmmss')
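
Wrapped into a full file name for the SharePoint Create file action, it might look like this (the prefix is just an example):

concat('DAX_Query_Results_', formatDateTime(utcNow(), 'yyyyMMdd_HHmmss'), '.csv')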

My example exports 9 rows of data to the CSV file and takes approximately 2-4 seconds to run. Not bad at all to authenticate, execute a query, convert the results and save them to a file.

Screenshot of the results saved to SharePoint


Wrap up


This API opens up amazing opportunities to integrate curated analytical data with other apps in your organisation or externally. I have 3-4 immediate use cases at my customers; however, I'll hold off until the public preview moves to GA (General Availability). I'd love to hear your use cases!


There is a limit of 100,000 rows on this API. In my opinion, extracting anything over even 10,000 rows should be a last resort, once you've exhausted other options to get that data. OLAP models are built for aggregation-type queries, not large-volume granular queries, which are better suited to relational databases.

