rishirajdeb
Advocate I

Passing parameter to dataflow from Power BI desktop

Hi All,

In our project we work with a significant data volume and use a dataflow to load the last 2 years of data (~5 GB compressed). We intend to use the dataflow as a centralized ETL layer: different Power BI reports will be created consuming this dataflow, depending on business needs.

 

Can someone please help me with how to pass a parameter from a Power BI report to the dataflow, so that we import only the subset of data relevant to that individual report? We cannot import the entire dataflow into a dataset, because with Power BI Pro, datasets are limited to 1 GB compressed.
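For context, the closest workaround today is to filter the dataflow entity from the report's own query, driven by a report-level Power Query parameter. A minimal M sketch, assuming a parameter named `SelectedCountry` and placeholder workspace/dataflow/entity names (note that the filter is applied after the entity is read, since dataflow sources don't fold predicates to storage):

```powerquery-m
// Sketch: filter a dataflow entity using a report-level parameter.
// "SelectedCountry" is a Power Query parameter defined in the report;
// the workspace, dataflow, and entity names below are placeholders.
let
    Source = PowerPlatform.Dataflows(null),
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    Workspace = Workspaces{[workspaceName = "SalesWorkspace"]}[Data],
    Dataflow = Workspace{[dataflowName = "CentralETL"]}[Data],
    FactSales = Dataflow{[entity = "FactSales", version = ""]}[Data],
    // This filter runs after the entity is downloaded from the dataflow's
    // storage -- it reduces the dataset size, not the data transferred.
    Filtered = Table.SelectRows(FactSales, each [Country] = SelectedCountry)
in
    Filtered
```

This keeps the dataset under the 1 GB Pro limit, but each refresh still reads the full entity from the dataflow.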

 

Any help on this would be highly appreciated!

 

Thanks,

Rishi

 

@selimovd @GilbertQ @lbendlin @collinq @amitchandak @v-luwang-msft @v-xuding-msft @ibarrau 

7 REPLIES
rishirajdeb
Advocate I

Thanks all for the responses!

 

I initially thought of passing parameters from the dataflow to the source, having the fact table split by some category (e.g. country), and consuming those separate dataflow entities from each Power BI report as required. That is of course still possible using the ODBC connector (our data source is Amazon Redshift). But while trying the native Redshift connector for Power BI (which has a significant performance edge over ODBC), I found no option to pass a SQL query while importing! Hence I wanted to confirm whether there is any way to pass a parameter to the flow itself from the report.
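For the ODBC route mentioned above, a parameterized SQL statement can be pushed down to Redshift with `Odbc.Query`. A minimal sketch, assuming a DSN named `RedshiftDSN` and a Power Query parameter `CountryParam` (both placeholders):

```powerquery-m
// Sketch: push a parameterized SQL query to Redshift via ODBC,
// since the native Amazon Redshift connector exposes no query box.
// "RedshiftDSN", the table, and "CountryParam" are placeholders.
let
    // Naive single-quote escaping; a production query should validate input.
    Sql = "SELECT * FROM sales.fact_orders WHERE country = '"
          & Text.Replace(CountryParam, "'", "''") & "'",
    Source = Odbc.Query("dsn=RedshiftDSN", Sql)
in
    Source
```

Because the filter runs inside Redshift, only the matching rows cross the wire, unlike a post-import filter on a dataflow entity.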

 

Does anyone know how to pass a parameter from a dataflow to Redshift using the native Redshift connector 😊? Or should I create a separate post for that?

 

Thanks for your patience!

 

Regards,

Rishi

Think of it from the perspective of your report. Your report connects to the partitions of your dataset and to the DirectQuery connections (if you have any), so your data needs to be present in one of them. The fact that your dataset feeds from a dataflow is not actually relevant; it just means that instead of a single-stage ETL you have multiple stages.

 

No matter what you do in the earlier stages, your data eventually has to reside in the partitions. One way to improve partition performance is incremental refresh, and as in SSAS you can decide for yourself what the partitions should be.
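Incremental refresh as suggested above hinges on the two reserved datetime parameters, `RangeStart` and `RangeEnd`, which Power BI substitutes per partition at refresh time. A minimal sketch, with placeholder source and column names:

```powerquery-m
// Sketch: filter pattern required by incremental refresh.
// "RangeStart" and "RangeEnd" are the reserved DateTime parameters;
// "FactSalesSource" and "OrderDate" are placeholders.
let
    Source = FactSalesSource,
    // Use >= on one boundary and < on the other so rows on a partition
    // edge are never duplicated across partitions.
    Filtered = Table.SelectRows(
        Source,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered
```

The incremental refresh policy itself (how much history to keep, how much to refresh) is then configured on the table in Power BI Desktop.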

Thanks @lbendlin for the response.

 

As per my understanding, the primary use of dataflows is to maintain a single source of truth and reusable data transformations, by separating the ETL layer from the data models (datasets). And that is exactly the business requirement we have!

 

Apart from that, as I said, we need to be able to manage a large volume of data with a Pro license (the data needs to be imported; DirectQuery is not suitable for our use cases). All the data we import would change, and there is no static part, so I don't think incremental refresh would be of much use.

 

Thanks,

Rishi

"the primary use of dataflows are to maintain a single point of truth and reusability of data transformations"

 

Meh. That's what your corporate data stewardship system / data warehouse is for (Azure Purview/CDS/Dataverse and their competitors). Dataflows are not adequate for that; for example, they have no guaranteed survivability. You will have scenarios where your dataflow borks down and you have to start over. Not something you want to experience with your data warehouse.

 

"All data we are importing would change & there is no static part"

 

Yes, that's a tough one. You need differential refresh and/or the equivalent of CDC. The more people need this, the more pressure we can exert on Microsoft to finally support it.

lbendlin
Super User

The whole point of a dataflow is that you can consume/import it as is. DQ on a dataflow is lipstick on a pig: it adds an intermediate SQL database layer that negates the advantage of dataflows (semi-structured blob storage).

 

Have you considered incremental refresh instead?

parry2k
Super User

@rishirajdeb I'm not sure DQ will help, but you can try going that route since dataflows allow DQ.

 


selimovd
Super User

Hey @rishirajdeb ,

 

as far as I know, Power BI dataflows don't support query folding. This means you can only import the whole dataflow entity and then filter for what you want to keep.

 

If you need any help please let me know.
 
Best regards
Denis
 
