Johann_32
Advocate II

Parameters in Dataflows

Hello,

 

We are in the start-up phase of Power BI at our company and, from the IT side, plan to provide standardized dataflows to our Power BI users. This is a key part of our governance strategy to offer IT-governed self-service.

 

These dataflows are then used by our power users and designers to create their own individual datasets, which they publish to the Power BI service to collaborate with each other.

 

This means our dataflows should be fairly high level, with only minimal logic in them, so that users can pick from a large pool of data.

 

Nevertheless, we wanted to restrict the amount of data in our dataflows somewhat by using parameters, because the full volume is simply too much data.

For example, we tried to implement a parameter called time_range in Power Query Online that limits the output to the corresponding time frame.
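A minimal sketch of what such a dataflow query might look like, assuming time_range is a numeric parameter holding the number of years to keep; the server, database, table, and column names are placeholders for illustration only:

```
// Hypothetical dataflow query filtered by a "time_range" parameter.
// "Sales" and "OrderDate" are placeholder names.
let
    Source = Sql.Database("server", "database"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Keep only rows newer than today minus time_range years
    Cutoff = Date.AddYears(Date.From(DateTime.LocalNow()), -time_range),
    Filtered = Table.SelectRows(Sales, each [OrderDate] >= Cutoff)
in
    Filtered
```

The catch described below is that time_range is evaluated when the dataflow refreshes, not when a consumer connects to it.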

 

Problem:

It seems that users cannot influence this parameter when consuming the dataflow in Power BI Desktop. If we set the dataflow to the last two years, users will only ever see the last two years and can never see anything else. Our idea was to define this parameter once in the dataflow and, as soon as a user consumes it, have some kind of prompt where they can specify exactly which two-year range (from/to) they need.

 

With datasets, there is the possibility in the settings (Power BI service) to change parameters conveniently. It seems this is not how dataflows are meant to be used.

 

Is the only way to achieve the target behavior to store all of the available data in the dataflow (without using any parameters) and then create parameters in Power BI Desktop before publishing the dataset to the Power BI service?

 

Or is it really necessary to create X duplicate dataflows, each with a slightly different parameter value? (That would mean a lot of maintenance overhead.)

 

Are we missing some functionality here?

 

Thanks in advance.

 

Best Regards,

 

Johann

 

 

1 ACCEPTED SOLUTION
selimovd
Super User

Hey @Johann_32 ,

 

A dataflow is usually refreshed a few times a day. After the refresh, the data is saved in Azure Data Lake Storage Gen2, and when you connect to the dataflow, the data is read from the data lake.

So if you load two years during the refresh, then the dataflow contains two years. When you connect with Power BI, you connect to the refreshed data in the data lake.

 

I would personally load all of the data into the dataflow; you can then still limit it in the Power BI report with a parameter. Would that work for you?
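A sketch of that Desktop-side approach: connect to the dataflow and filter locally with report-defined parameters. The navigation steps follow the general shape of the code Power BI Desktop generates for the dataflows connector, but the exact record fields, workspace, and entity names here are assumptions; RangeStart and RangeEnd are date parameters the report author would define in Desktop:

```
// Hypothetical Power BI Desktop query: read a dataflow entity, then
// filter it locally using user-defined parameters RangeStart / RangeEnd.
let
    Source = PowerPlatform.Dataflows(null),
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    Workspace = Workspaces{[workspaceName = "IT Governed Dataflows"]}[Data],
    Dataflow = Workspace{[dataflowName = "Enterprise Data"]}[Data],
    Sales = Dataflow{[entity = "Sales"]}[Data],
    // The full history lives in the dataflow; only this slice is loaded
    Filtered = Table.SelectRows(Sales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered
```

Since the filter runs in the consuming report, each user can pick their own range without IT maintaining multiple dataflows.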

 

If you need any help please let me know.
If I answered your question I would be happy if you could mark my post as a solution ✔️ and give it a thumbs up 👍
 
Best regards
Denis
 


4 REPLIES
Johann_32
Advocate II

Hello Denis,

 

Thanks a lot for the quick response.

 

I agree with loading all of the data into the dataflow and then limiting it in Power BI Desktop, depending on the needs of the user.

 

Still, I hope that what we as IT are trying to do, providing dataflows for our Power BI users, makes sense.

Or are we taking on too much governance and responsibility on the IT side, making the tool no longer "self-service"?

 

I haven't seen this approach in any governance whitepaper or community discussion. In general, we would like to prevent users from consuming non-standardized data that they later compare with our enterprise data warehouse reporting solution.

 

 

Thank you

 


Hi @selimovd 

 

Hope you don't mind me jumping in on this question, but you mention: "I personally would load the whole data to the data flow, then you can still limit it in the Power BI report with a parameter." That is exactly what we would want to do, but I can't find anywhere how to write the syntax to use a parameter within Power Query when using a dataflow as a data source. Is this still possible?

 

Thanks,

Elliott

Hey @elliottriley ,

 

Sure, Chris Webb wrote an article about how to do that a few years ago.

I would implement it the way it is described in the article. Check it out:

https://blog.crossjoin.co.uk/2020/05/24/limit-the-amount-of-data-you-work-with-in-power-bi-desktop-u...

 

If you need any help please let me know.
If I answered your question I would be happy if you could mark my post as a solution ✔️ and give it a thumbs up 👍

Best regards
Denis

Blog: WhatTheFact.bi
Follow me: twitter.com/DenSelimovic
