
Anonymous
Not applicable

determine load order of tables

Hello,

 

I have a Power BI report with tables that, upon refresh, trigger stored procedures. Each stored procedure first inserts its query results into a table, then inserts a record of having been triggered into a kind of metadata table, and finally returns the first table as a source to Power BI. It works this way because the data has to be refreshed manually on a signal rather than on a schedule, and designing the procedures/report like this removes the extra step of having to wait on SQL Server. While the overall process might not be considered optimal, it's not within my power to change anything about it.
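For context, each of these source queries looks roughly like the following (the server, database and procedure names below are placeholders, not the real ones):

let
    // Placeholder names - the real server/database/procedure differ.
    // Running the procedure inserts its results into a table, logs a row
    // to the metadata table, and returns the result set to Power BI.
    Source = Sql.Database(
        "sql-server-name",
        "ReportingDb",
        [Query = "EXEC dbo.usp_RefreshTable1"]
    )
in
    Source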

 

The problem is this: besides the results of the 8 tables that are filled by stored procedures, I also want to load the metadata table. Report users want to see history, so the metadata is needed to build a slicer - not least because some procedures usually don't return any rows, and the metadata table is the only place that holds 'evidence' of a procedure having run. This table, however, only contains all the data once every procedure has fully run, and therefore needs to load last.

I have disabled parallel loading in the settings and dragged the metadata table to the bottom of the queries list on the left of the Power Query Editor. This seems to do the trick. However, I can't find any evidence that dragging the table to the bottom of the list guarantees that it loads last. Is that the case? Also, there is no reason for most queries to run in series, so this solution unnecessarily prolongs refresh times.

 

Ideally, I would like some certainty, for instance via an M function like a hypothetical Function.LoadAfter([table name]). Function.InvokeAfter() doesn't do the trick, since I don't want to hardcode durations: if a refresh suddenly takes longer than expected, the table will still load at the wrong moment. Another option would be to adjust query dependencies manually. I found the Query Dependencies diagram under the 'View' tab, but it only shows dependencies 'as is' and doesn't let me change them. That gave me the idea of creating true/false parameters for each table, initially false and flipped to true as soon as the corresponding table has been fetched; the metadata table would then run once all parameters are 'true', effectively creating dependencies by hand. But I don't think anything like that is possible (and frankly it would also be a very cumbersome solution).
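For illustration, the closest thing to a manual dependency I can see in M is to make the metadata query reference the other queries directly, roughly as sketched below (all query and table names here are made up). That does register the eight queries as upstream dependencies, but since every import-mode query is evaluated independently, the referenced queries may simply be re-evaluated while the metadata query runs - which, with side-effecting procedures, could fire them a second time - so I doubt it really guarantees the order in which the model is populated.

let
    // Made-up names: Table1..Table8 are the queries that call the stored
    // procedures, dbo.RefreshMetadata is the metadata table.
    // Touching every result query makes them upstream of this query.
    Dummy = List.Sum({
        Table.RowCount(Table1), Table.RowCount(Table2),
        Table.RowCount(Table3), Table.RowCount(Table4),
        Table.RowCount(Table5), Table.RowCount(Table6),
        Table.RowCount(Table7), Table.RowCount(Table8)
    }),
    Source = Sql.Database("sql-server-name", "ReportingDb"),
    Metadata = Source{[Schema = "dbo", Item = "RefreshMetadata"]}[Data],
    // M is lazy, so Dummy has to feed into the result or it is never evaluated.
    Result = if Dummy >= 0 then Metadata else Metadata
in
    Result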

 

So, in short, I seem to have a solution for forcing a load order on the tables, but I'm not certain it is guaranteed to work every time, and it's not ideal since all tables have to load in series. However, every other kind of solution I can think of (Power Query functions, manually defining load order) hits a dead end. Any suggestions?

 

Thanks in advance!

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Ultimately I ended up discussing all the difficulties associated with our solution with the process architects, and we decided to change the process a little after all, since it had become far too complicated for what it actually is. The user will now execute an overarching procedure in SQL Server first, and then load the result tables into Power BI. While slightly less user friendly because of the added intermediate step, this will be a far less complicated - and therefore more reliable - process.
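The queries on the Power BI side then become plain reads of tables the overarching procedure has already populated (no procedure execution during refresh), so the load order no longer matters. Roughly, with placeholder names:

let
    // Placeholder names - a plain table read with no side effects, since the
    // overarching procedure has already filled the result and metadata tables.
    Source = Sql.Database("sql-server-name", "ReportingDb"),
    Results = Source{[Schema = "dbo", Item = "Table1Results"]}[Data]
in
    Results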

 

So not really a solution that achieves the above as I wanted, but one that avoids the problem altogether.


3 REPLIES

lbendlin
Super User

I have disabled parallel loading in the settings

Yeah, that does nothing. This setting is ignored by the Power BI service - a longstanding bug.

 

Why don't you use the metadata table refresh as the trigger and then let Power Query load tables however it wants?

 

Alternatively, consider using the Enhanced Dataset Refresh API or the XMLA endpoint queries to selectively refresh individual tables and/or partitions. 

Anonymous
Not applicable

Thank you, I'll have a look at the API and the XMLA endpoint. What do you mean by using the metadata table refresh as the trigger, given that this table needs to load last in order to contain all the relevant information?
