How does Power BI determine the order in which queries are loaded into the data model? Is it determined by the order in which you added the sources?
I was thinking the exact same thing.
I have a model that has evolved over time. It now takes about 10min to load all the data.
One of the tables is a single row, Last Refresh table, that reports on the currency of the data.
Being small, this table is always one of the first to load, and that means the Last Refresh time stamp is always a little bit out.
It's not a big issue, but it would be nice if there was some way to flag that this table should load last.
In the following video there is a possible solution:
After disabling parallel loading ([!] I have not tested the effect this has on performance), you can add the M function Function.InvokeAfter in the Advanced Editor.
I think that you could set it up where the function that is called "daisy chains" through the load order you're interested in.
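A minimal sketch of that daisy-chain idea in M. "FactSales" is a hypothetical name for a slow query; this defines the Last Refresh table so it cannot finish evaluating until FactSales has loaded:

```m
// Sketch only: "FactSales" is a placeholder for your slow query's name.
let
    // Take a dependency on the slow query so this query waits for it
    factRows = Table.RowCount(FactSales),
    // Function.InvokeAfter(fn, delay) invokes fn after the delay elapses.
    // The delay expression references factRows so that, despite M's lazy
    // evaluation, the slow load must complete before the timer starts.
    stamp = Function.InvokeAfter(
        () => DateTime.LocalNow(),
        #duration(0, 0, 0, if factRows >= 0 then 1 else 1)
    ),
    LastRefresh = #table(
        type table [LastRefresh = datetime, FactRows = Int64.Type],
        {{stamp, factRows}}
    )
in
    LastRefresh
```

To chain more queries, each subsequent query would take a similar dependency on the previous one's output.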
However, there's probably a more elegant solution. If I find it, I'll check back... because I'd like to add a table for load duration, in addition to the load time obtained through DateTime.LocalNow() or, in my case, the PL/SQL:
TO_CHAR(SYSDATE, 'DDMONYYYY HH24:MI:SS') "Retrieved"
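For anyone wiring that SQL into Power Query, a hedged sketch of what the query might look like (the server name is a placeholder, and Oracle requires FROM DUAL when selecting an expression with no source table):

```m
// Hypothetical connection; "myserver" is a placeholder server name.
let
    Source = Oracle.Database(
        "myserver",
        [Query = "SELECT TO_CHAR(SYSDATE, 'DDMONYYYY HH24:MI:SS') ""Retrieved"" FROM DUAL"]
    )
in
    Source
```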
Unrelated, kind of (but it also required sequential loads)... I wanted to calculate data load durations.
Here is a not-so-elegant solution I just used in 3 steps:
1) Get a start time (from your data, or from a table using DateTime.LocalNow())
2) Invoke a simple function that references the slowest data set
3) Get an end time: use DateTime.LocalNow() in a table that is dependent upon steps 1 and 2.
1) Getting a Refresh Start Time table loaded:
My first step was to load a table that refreshes immediately upon connection (I used the SQL statement from the post above to get the time from the queried database; you could use DateTime.LocalNow() instead).
2) Invoke a function to call a reference to your slowest data set.
(No closing parentheses; choose a simple aggregate function like count or max)
(Open the function)
(Invoke it on the data you want to measure load times for)
3) Load the invoked function into a table to calculate the duration of the load:
(Convert to table)
(Load the start time)
(Get the DateTime.LocalNow() time after the invoked function is loaded)
(Calculate the duration)
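The three steps above might look something like this in one M query. "SlowQuery" is a placeholder for the name of your slowest query, and this is a sketch, not the poster's exact code:

```m
// Sketch only: "SlowQuery" is a placeholder query name.
let
    // Step 1: start time. Note that M evaluates lazily, so for a reliable
    // start time it is safer to read it from the separately loaded
    // start-time table (or the database), as the post does.
    StartTime = DateTime.LocalNow(),
    // Step 2: keep the aggregate as a bare function value, then invoke it,
    // so the slow load happens inside this query's evaluation
    Agg = Table.RowCount,
    RowsLoaded = Agg(SlowQuery),
    // Step 3: end time, with an artificial dependency on RowsLoaded so
    // lazy evaluation cannot take the timestamp before the load finishes
    EndTime = if RowsLoaded >= 0 then DateTime.LocalNow() else DateTime.LocalNow(),
    Result = #table(
        type table [Start = datetime, End = datetime, Rows = Int64.Type, DurationSeconds = number],
        {{StartTime, EndTime, RowsLoaded, Duration.TotalSeconds(EndTime - StartTime)}}
    )
in
    Result
```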
I timed my query with and without the duration measurement and consistently got a load time of 72-73 seconds:
Trial 1 (with): 20s to connect, 93s to load (73s)
Trial 2 (with): 14s to connect, 86s to load (72s)
Trial 3 (with, just evaluating): 7s to evaluate, 80s to load (73s)
Trial 4 (without): 13s to connect, 86s to load (73s)
The technique doesn't work if you load the data-for-duration-calculation directly into the table doing the calculation: you need to invoke a function, then load the data into the aggregation. When invoking the function, use something simple passed as a bare function value, like Table.RowCount, not Table.RowCount(YourData); the direct call creates a reference that loads the data immediately and allows Power Query to use parallel processing.
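The distinction described above, side by side (a sketch; "YourData" is a placeholder query name, and the parallel-loading behavior is as the post reports it, not something I have verified):

```m
// Sketch only: "YourData" is a placeholder.
let
    // Direct call: Power Query sees the reference up front and is free
    // to load YourData in parallel with everything else
    Direct = Table.RowCount(YourData),

    // Bare function value, invoked as a later step: the load is triggered
    // by the invocation itself, which is what keeps the timing sequential
    Fn = Table.RowCount,
    Invoked = Fn(YourData)
in
    Invoked
```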
It's not an elegant solution.
Chris Webb has a series of blog posts on the Diagnostics functions... which is great if you can pore over your AppData log files, but I have no idea how you would calculate load duration in the Power BI web service, where you don't have the log files.
Maybe there is a better solution out there. If anyone finds it, or if the Power BI Team makes a tool in the service for it, where it can populate within a dashboard - please let the community know.
My next approach would be to use some Power Automate magic, but I think I'll call it a night.