Hi all
I configured incremental refresh on a report based on a transactions table with millions of records for which I want to:
- Keep records of the last (full) year
- Refresh data of the last 3 months.
I'm using an on-premises gateway and an Oracle database, with a Power BI Pro license.
The incremental refresh setup was done and tested successfully on a small portion of the data, but when applying it to the full dataset I always get this error message after 2 hours, during the very first upload: "Before the data import for finished, its data source timed out. Double-check whether that data source can process import queries, and if it can, try again."
Is there any way to bypass this issue? A colleague suggested working with parameters on app.powerbi.com, on the page where the dataset is managed, but there I see this message: "Parameters haven't been defined for this dataset yet. If you want to set parameters, use the Query Editor." This is very strange, because I configured the parameters in the desktop application.
Can you please help me?
Thank you in advance.
Francesca
Hi, @fbottazzoli
While the data type of the parameters must be date/time, it is possible to convert them to match the requirements of the data source. For example, the following Power Query function converts a date/time value into an integer surrogate key of the form yyyymmdd:
(x as datetime) => Date.Year(x)*10000 + Date.Month(x)*100 + Date.Day(x)
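To show where such a function fits, here is a minimal sketch of filtering on an integer yyyymmdd column using the RangeStart/RangeEnd parameters that incremental refresh requires. `Source` and the column name `DATE_KEY` are placeholders for your own query.

```powerquery-m
// Sketch: filter a table whose DATE_KEY column is an integer yyyymmdd,
// using the RangeStart/RangeEnd date/time parameters required by incremental refresh.
let
    ToDateKey = (x as datetime) => Date.Year(x)*10000 + Date.Month(x)*100 + Date.Day(x),
    Filtered = Table.SelectRows(
        Source,
        each [DATE_KEY] >= ToDateKey(RangeStart) and [DATE_KEY] < ToDateKey(RangeEnd)
    )
in
    Filtered
```

Note that the conversion is applied to the parameters rather than to the column, so the filter can still fold back to the source as a simple range predicate.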
Queries can also be limited by the default timeout for the data source. Most relational sources allow overriding timeouts in the M expression. Please consider reducing the size or complexity of your dataset, or breaking it into smaller pieces.
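As an illustration of overriding the timeout in M, the Oracle connector accepts a `CommandTimeout` option; the server name below is a placeholder.

```powerquery-m
// Sketch: raise the command timeout for the Oracle source in the M expression.
// "myserver" is a placeholder; #duration(days, hours, minutes, seconds) sets 4 hours here.
Source = Oracle.Database("myserver", [CommandTimeout = #duration(0, 4, 0, 0)])
```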
For further information, you may refer to the documentation.
Best Regards
Allan
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @v-alq-msft
Thank you for your suggestion. However, I don't really understand how it could help me, since I was already able to configure the incremental refresh but wasn't able to fix the timeout occurring during the very first (massive) data upload. Please note that the timeout seems to be due to the data transfer rather than a database timeout.
Anyway, I'm now trying to split my dataset into smaller ones, but then I don't see the advantage of the incremental refresh feature for the Pro license.
Thank you,
Francesca
Hi @GilbertQ
Thank you for your answer. I already tried that option, setting it to 3600 (maybe too much 🙂). I'm trying again with a lower value (180 minutes) to see what happens. Please note that, based on the checks we made on the database side, the query execution in Oracle finishes within the two hours, but the data transfer through the gateway takes longer.
If this test fails, I will manually partition the transactions table into smaller date-based tables, merge them with a union query, and disable refresh for the older ones. I applied this workaround in other reports, but I really hoped that incremental refresh could have helped.
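For reference, the union step of that workaround can be sketched in Power Query as below; the query names are hypothetical, and refresh would be disabled on the historical queries by unchecking "Include in report refresh".

```powerquery-m
// Sketch: append date-partitioned queries into one table.
// Transactions2022, Transactions2023 and TransactionsCurrent are placeholder query names.
Combined = Table.Combine({Transactions2022, Transactions2023, TransactionsCurrent})
```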
Best regards,
Francesca