Hi,
I have a Power BI report that imports data from an MDX cube using a query. It has only 2 million records, but the refresh takes more than 20 minutes and fails with a timeout error, as we have a 20-minute limit on the cube.
What could be the reason, and is there a solution?
Thanks,
R
Solved! Go to Solution.
Hi @Anonymous,
How does the refresh perform on the Desktop? If the refresh works much faster there than in the service:
1. Check your gateway server; make sure it has enough memory, CPU, and bandwidth to reach both the data source and the Power BI service.
2. Keep in mind that a refresh also consumes part of the capacity you own.
Check what else is running at the same time and taking over the capacity.
One solution is to schedule the refresh during off-peak hours or queue refreshes overnight.
3. Are there any advanced operations in your query tables (e.g., combine, append, referencing another query, calculating against an external query table, custom functions)?
If so, these referenced queries add calculation cost: each row loops through the reference table, which increases memory usage and calculation time.
You can try List.Buffer or Table.Buffer to cache the referenced query in memory and reduce this extra resource spend.
Reference links:
How to Improve Query Reference performance for large tables
Use of Table.Buffer in references
Also see the official documentation: Power BI performance best practices
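To illustrate point 3, here is a minimal Power Query (M) sketch of buffering a referenced query with Table.Buffer so that a merge reads the lookup table from memory instead of re-evaluating its source for each row. The query and column names (DimCustomer, Sales, CustomerKey) are placeholders, not from the original post:

```
// Hypothetical sketch: buffer a referenced lookup query before merging.
let
    Source      = Sales,                       // the main (large) query
    BufferedDim = Table.Buffer(DimCustomer),   // cache the lookup table in memory once
    Merged      = Table.NestedJoin(
                      Source, {"CustomerKey"},
                      BufferedDim, {"CustomerKey"},
                      "Customer", JoinKind.LeftOuter)
in
    Merged
```

Without the Table.Buffer step, the referenced query can be re-evaluated during the join, which is where the extra memory and calculation time tends to come from.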
Hi @Anonymous,
It's always difficult to diagnose this without knowing the network infrastructure. First, remove any columns you are not using and clean up your model. Is it your visuals that are taking a long time to refresh, or loading the data into the Query Editor?
You could also look at the Performance Analyzer, available from the View ribbon. There is also documentation on getting the best performance on Microsoft's website.
Also, with 2 million records it may be worth investing in a staging database.
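The column-trimming advice above can be sketched in Power Query (M): select only the columns you need as early as possible in the query, so less data is pulled from the cube and held in memory. The query and column names (SalesQuery, Date, ProductKey, Amount) are placeholders for illustration:

```
// Hypothetical sketch: keep only the columns the report actually uses.
let
    Source  = SalesQuery,  // the imported cube query
    Trimmed = Table.SelectColumns(Source, {"Date", "ProductKey", "Amount"})
in
    Trimmed
```

Doing this in the first steps of the query (rather than hiding columns later in the model) gives the engine the best chance to reduce the data volume during refresh.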
Thanks
Dobby Libr3