I have a SQL view that is about 16,000 MB in size, and I keep about 4 duplicates of this view in the .pbix file in order to deal with complex data and business logic.
Is it too big for Power BI to handle? My computer crashes again and again when trying to apply the query.
(I have a Pro license.)
What are my options?
Thanks a lot.
** Comments after checking
I have another report with a table size of 337 in Azure SQL Data Warehouse, and importing it causes no problems.
The view with 16,000 MB is on SQL Server, but has complex M functions.
Another comment: my computer crashes even though the query is only `select * from table`
(16,000 MB and roughly ~90M rows).
I suspect this is even crashing the server.
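For context, a plain `select * from table` pull like the one described would typically look something like this in M (the server, database, and table names here are placeholders, not the poster's actual names):

```
let
    // Pass the SQL statement as a native query via the Query option
    Source = Sql.Database(
        "MyServer",
        "MyDb",
        [Query = "select * from dbo.MyBigView"]
    )
in
    Source
```

Even a bare `select *` over ~90M rows forces both the server and Power BI Desktop to materialize the full result set during evaluation, which is consistent with the crash happening at that stage.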
If your computer crashes, that might be a sign of a bigger issue. You might need to check the system event logs for that.
As for how much data you can fit in Power BI Desktop, you still have a long way to go. I've managed to load 4 bn rows (2^32 − 2, to be precise) into Power BI.
As to your particular case, run through the following checklist:
Hi @hugoberry, thank you very much for your reply.
Can you elaborate more on the checklist? I don't know the things you mentioned.
What is that CommandTimeout? Where can I define it? I also don't know what buffering the data means.
Thanks a lot again.
I don't have much information about your setup, so I assumed that in your query you might be using the following data access functions:
- Table.Buffer — buffers a table into memory, isolating it from external changes during evaluation.
- Table.Combine — returns a table that is the result of merging a list of tables. The tables must all have the same row type structure.
If you don't have too many trade secrets in your M queries, post them to get more precise suggestions.
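To make CommandTimeout and buffering concrete, here is a minimal M sketch combining both — assuming a SQL Server source with placeholder server, database, and view names (these are illustrations, not the poster's real setup):

```
let
    // CommandTimeout raises the default query timeout;
    // here it is set to one hour as a #duration(days, hours, minutes, seconds)
    Source = Sql.Database(
        "MyServer",
        "MyDb",
        [CommandTimeout = #duration(0, 1, 0, 0)]
    ),
    // Navigate to the view
    BigView = Source{[Schema = "dbo", Item = "MyBigView"]}[Data],
    // Table.Buffer pins the table in memory so later steps
    // evaluate against the buffered copy instead of re-querying the server
    Buffered = Table.Buffer(BigView)
in
    Buffered
```

CommandTimeout is defined in the options record of the source function (here `Sql.Database`), so it has to be added where the connection is made, not as a separate step.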
@hugoberry, thank you.
I added the timeout function and I'm getting the same error message again (as above).
Do you know if the problem might be on my server rather than on the Power BI side?
I ask because it crashed at the 'Evaluating' status, not even in the middle of loading the rows.
Currently, even without a solid document about dataset size limitations to reference, our testing shows that a large amount of data (90M rows) will perform very poorly in the Query Editor. So in this scenario, we suggest you use DirectQuery mode.