We are using Power BI in our organization. We have around 400 GB of data on our server. The issue is that it takes around a day to refresh all the dashboards, which are used to follow up on and monitor daily achievements. Is there any method to speed up the refresh process? Any suggestions would be highly appreciated.
To improve the performance of importing a large dataset from an Oracle database into Power BI, the key is adjusting the "FetchSize" parameter in the connection string. Unfortunately, there is no way to do that with the native Oracle connector in Power BI.
Luckily, you can adjust the connection string in the OleDb connector. In my case, I got a 5x performance improvement just by setting "FetchSize" to 5000.
Check the link below.
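The OleDb workaround described above can be sketched in Power Query M. This is a minimal sketch, not the poster's exact query: the provider name, data source alias, and table name (OraOLEDB.Oracle, MyOracleDB, SALES) are placeholder assumptions you would replace with your own.

```
let
    // FetchSize raises the number of rows fetched per round trip
    // (OraOLEDB's default is 100), which reduces network round trips
    // for large result sets.
    Source = OleDb.DataSource(
        "Provider=OraOLEDB.Oracle;Data Source=MyOracleDB;FetchSize=5000",
        [Query = "SELECT * FROM SALES"]
    )
in
    Source
```

Note that this goes through the generic OleDb connector rather than the native Oracle connector, so you may need to supply credentials for the OleDb source separately in the data source settings.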
If you are talking about Report Server (on-premises)...
1) What is your source of data? An MSSQL database or something else? Some sources (including Oracle) are slow, and it is better to load the data into an MSSQL database first.
2) Check where the problem is: serving data from the source, network transfer, processing (column compression), or saving to the catalog. Then optimize or speed up the problematic part.
3) Check whether the problem is generic or report-specific; the problem can be in the model design within the report.
4) For enterprise deployment of large models, try a better approach: use a dedicated SSAS Tabular instance, use incremental load with its additional processing options and benefits, and use Power BI only as the visualization tool...
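The incremental-load idea in point 4 can be sketched in Power Query M. Power BI's incremental refresh relies on two datetime parameters that must be named RangeStart and RangeEnd; the server, database, table, and date column names below are placeholder assumptions.

```
let
    Source = Sql.Database("MyServer", "MyDb"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Only rows in [RangeStart, RangeEnd) belong to a given partition;
    // Power BI substitutes the parameter values per partition at
    // refresh time, so only recent partitions are reloaded.
    Filtered = Table.SelectRows(
        Sales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

With this filter in place, you configure the incremental refresh policy on the table in Power BI Desktop (e.g. store 5 years, refresh only the last few days), so a daily refresh no longer reloads the full history.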
Connecting to an Oracle DB for large tables is very complicated, and it is poorly supported on the Oracle side.
Performance depends heavily on the type and version of the Oracle client.
Oracle ODP.NET (which is used by the "Oracle database" connector in Power BI) and the ODBC connectors are very slow regardless of version, and are unusable for large tables.
The Oracle OLEDB provider is a little better.
In the default configuration it is just as slow as the above, but with some registry (or connection string) settings for FetchSize you can gain a significant performance boost (about 3x to 5x, depending on the version) by changing the default value of 100 to 1000 or more (higher values bring only a minimal further boost, but cause additional trouble when many loads run in parallel). It is a big improvement, but the best you can get is still much slower than competing solutions.
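The registry route mentioned above can be sketched as a .reg fragment. This is an assumption-laden sketch: the Oracle home key name (KEY_OraClient12Home1 here) is installation-specific, and you should confirm the value name and type against the OraOLEDB documentation for your client version before applying it.

```
Windows Registry Editor Version 5.00

; Raise OraOLEDB's default fetch size from 100 rows per round trip.
; The exact key name depends on your Oracle client home; check under
; HKLM\SOFTWARE\Oracle for your installation.
[HKEY_LOCAL_MACHINE\SOFTWARE\Oracle\KEY_OraClient12Home1\OLEDB]
"FetchSize"="5000"
```

The registry setting changes the default for every OLEDB session on the machine, whereas putting FetchSize in the connection string affects only that one connection.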
Another complication is the version: Oracle OLEDB works well only with the 12.1 client, but that version is now unsupported. In version 12.2 (and also 18.3) Oracle introduced a memory bug where transferring a large table consumes an extremely large amount of memory (the connector consumes approximately 4 GB of RAM to import a 400 MB table). In version 19.3 Oracle introduced another bug, again affecting large tables, where every table transfer slows down rapidly (to approximately half its speed every 15 minutes).
Currently the best solution for large Oracle tables is to use a 3rd-party connector (e.g. https://www.progress.com/odbc/oracle-database ), which is much better than Oracle OLEDB. Or use a two-step extract: first extract the Oracle data to MSSQL using SSIS with the Microsoft-provided Oracle connector ( https://www.microsoft.com/en-us/download/details.aspx?id=58228 ), and then load into Power BI from MSSQL (although it takes two steps, it is still faster than Oracle OLEDB).
Thank you for extending your help to us. Currently we are carrying out a DB-to-DB data transfer. Any suggestions on how to improve the data transfer speed from DB to DB?
Can you please explain a bit more?
Are you trying to refresh data from Oracle directly into Power BI, or are you refreshing data first from Oracle to SQL Server and then into Power BI?
Is it necessary to import all the data (400 GB) into Power BI? Since the data refresh takes too much time, could you try DirectQuery instead?