MP_123
Member

Data Capacity

hi,

I have a SQL view with a size of 16,000 MB, and I keep about 4 duplicates of this view in the .pbix file in order to deal with complex data & business logic.

Is it too big for Power BI to handle? My computer crashes again and again when I try to apply the query.

(I have a Pro user license)

What are my options?

Thanks a lot

** Comments after checking:

 

I have another report with a table of size 337 in Azure SQL Data Warehouse, and importing it is no problem.

 

The view with 16,000 MB is on SQL Server, but it has complex M functions.

 

Another comment: my computer crashes even though the query is only select * from table

(16,000 MB and about ~90M rows)

I suspect this even crashes the server.

(attachment: errorsql.PNG)

7 REPLIES
MP_123
Member

Re: Data Capacity

Hi,

Please, can someone tell me who I can get advice from?

 

thanks

hugoberry
Member

Re: Data Capacity

If your computer crashes, that might be a sign of a bigger issue. You might need to check the system event logs for that.

As for how much data you can fit in Power BI Desktop, you still have a long way to go. I've managed to load 4 bn rows (2^32-2, to be precise) into Power BI.

 

https://querypower.com/2017/02/04/power-bi-desktop-limit/

 

As to your particular case, run through the following checklist:

  • Have you defined a CommandTimeout?
  • Have you tried to buffer the data? Very useful if you are doing lots of M processing in your queries.
  • Have you tried splitting the initial select query into chunks? I know that Power BI gives you very little control over the parallelism you can achieve, but you can trick it by merging queries that wait on other queries to finish (Table.Combine).
  • Check the trace logs from Power BI, and also fire up Profiler on the SQL Server side to get an idea of what is actually happening with that select * from table.
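The first two checklist items can be sketched together in M. This is only an illustration: the server, database, and view names below are placeholders, not anything from your setup; CommandTimeout and Table.Buffer themselves are documented M options/functions.

```m
// Hypothetical sketch (server/database/view names are placeholders):
// Sql.Database with an explicit CommandTimeout, and the result buffered
// into memory before any heavy M transformations are applied.
let
    Source = Sql.Database(
        "MyServer",
        "MyDatabase",
        [
            Query = "select * from dbo.MyBigView",
            CommandTimeout = #duration(0, 2, 0, 0)  // allow up to 2 hours
        ]
    ),
    // Table.Buffer pins the table in memory, isolating later steps
    // from re-querying the source on every evaluation
    Buffered = Table.Buffer(Source)
in
    Buffered
```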
MP_123
Member

Re: Data Capacity

Hi @hugoberry, thank you very much for your reply.

Can you elaborate more on the checklist? I don't know the things you mentioned.

What is that CommandTimeout? Where can I define it? I also don't know what buffering the data means.

Thanks a lot again

hugoberry
Member

Re: Data Capacity

I don't have much information about your setup, so I assumed that in your query you might be using one of the following data access functions:

  • Odbc.DataSource: you might be able to define Connection Timeout in the connection string
  • OleDb.DataSource: likewise, Connection Timeout in the connection string
  • Sql.Database (most likely): which has the following option, CommandTimeout: a duration which controls how long the server-side query is allowed to run before it is canceled. The default value is ten minutes.

Table.Buffer: buffers a table into memory, isolating it from external changes during evaluation.

Table.Combine: returns a table that is the result of merging a list of tables. The tables must all have the same row type structure.
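As a rough illustration of the chunking trick mentioned earlier: split the big select into key ranges and recombine them with Table.Combine. All names below, and the assumption that the view has an integer key column you can range over, are hypothetical.

```m
// Hypothetical sketch: split one big select into two key ranges and
// recombine them. Assumes an integer key column "Id" on the view;
// server/database/view names are placeholders.
let
    Chunk1 = Sql.Database("MyServer", "MyDatabase",
        [Query = "select * from dbo.MyBigView where Id <= 45000000"]),
    Chunk2 = Sql.Database("MyServer", "MyDatabase",
        [Query = "select * from dbo.MyBigView where Id > 45000000"]),
    // Table.Combine requires the chunks to share the same column structure
    Combined = Table.Combine({Chunk1, Chunk2})
in
    Combined
```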

 

If you don't have too many trade secrets in your M queries, post them here to get more precise suggestions.

MP_123
Member

Re: Data Capacity

@hugoberry, thank you.

I added the timeout option and I'm getting the same error message again (as above)...

Do you know if the problem might be on my server, and not on the PBI side? Because it crashes at the 'Evaluating' status, not even in the middle of loading the rows.

 

thanks

hugoberry
Member

Re: Data Capacity

Depending on your SQL view, this could be the case. Can you change the query to select top 10 * from table to see if it evaluates correctly? What are you getting in SQL Server Profiler?
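A minimal way to run that sanity check from Power BI (server/database/view names are placeholders):

```m
// Hypothetical sketch: load only the first 10 rows to check whether the
// view evaluates at all before attempting the full ~90M rows.
let
    Source = Sql.Database("MyServer", "MyDatabase",
        [Query = "select top 10 * from dbo.MyBigView"])
in
    Source
```

If even this small query hangs at 'Evaluating', the bottleneck is almost certainly the view itself on the server side, not Power BI's load of the rows.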

 

v-sihou-msft
Moderator

Re: Data Capacity

@MP_123

 

Currently, even though there is no solid documentation about dataset size limitations for reference, in our testing a big amount of data (90M rows) will have really poor performance in the Query Editor. So in this scenario, we suggest you use DirectQuery mode.

 

Regards,