
rocky09
Solution Sage

Working with Large Data

Just wondering: how do you all work with large data?

 

1. Do you import all the tables completely from a database such as SQL Server or Oracle into Power BI?

2. Do you create a new table with a SQL query (combining different tables with different criteria) and import that into Power BI?
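For option 2, one common pattern is to push the join down to the database from Power Query so only the combined result is imported. A minimal sketch in Power Query M; the server, database, table, and column names are hypothetical placeholders, not from this thread:

```m
// Sketch only: "MyServer", "SalesDB", and all table/column names are placeholders.
let
    Source = Sql.Database("MyServer", "SalesDB"),
    // Value.NativeQuery sends the join and filter to the server,
    // so Power BI imports only the rows the query returns.
    Combined = Value.NativeQuery(
        Source,
        "SELECT o.OrderID, o.OrderDate, c.CustomerName
         FROM dbo.Orders o
         JOIN dbo.Customers c ON c.CustomerID = o.CustomerID
         WHERE o.OrderDate >= '2016-01-01'"
    )
in
    Combined
```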

 

Please share your thoughts.

 

Thank you,

 

 

6 REPLIES
vanessafvg
Super User

Define large? I've been struggling with DirectQuery being very slow, so much so that I was pulling my hair out. It could be our network, though, so I've taken to reducing my model to only what I need (which is best practice anyway; I wanted to do that via SQL code but was unable to). Importing has worked better.

 

The issue I was having with writing SQL queries is that I log onto a dev domain from my local machine, and when I try to write SQL queries instead of bringing in the whole table, I get "user not authorised". I still haven't worked out why that happens. I then created views, but I still found DirectQuery slow, and it was using a lot of my memory too, even though I expected that to happen on the server side.









v-sihou-msft
Employee

@rocky09

 

I suggest you separate the data into multiple views to import, or use DirectQuery instead of Import mode.

 

Reference:
Data Import Best Practices in Power BI

 

Regards,

@v-sihou-msft

 

Thanks for the link. Most of the content helped me.

 

I am using individual queries to get the data from the server. But sometimes a common column is missing between tables, so I am not able to create relationships. Is there a way to create a custom common column?

@rocky09

 

I don't know what your common column looks like. You can add custom columns, such as an index column.
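If no single shared column exists, another option is a composite key built from existing columns, added to both tables and used as the relationship column. A minimal Power Query M sketch; "Region" and "ProductCode" are hypothetical column names:

```m
// Add the same key column to both tables, then relate them on MergedKey.
// "Region" and "ProductCode" are placeholder column names.
AddKey = Table.AddColumn(
    Source,
    "MergedKey",
    each Text.From([Region]) & "|" & Text.From([ProductCode]),
    type text
)
```

The "|" separator guards against two different column pairs concatenating to the same key (e.g. "AB"+"C" vs "A"+"BC").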

 

Regards,

It depends.

 

If you have a very large data set but only need to work with a subset of it, then the filters you set up in the Query Editor will limit the import to that subset. Keep in mind that there are file size limits and workspace storage limits in the Power BI service.
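A filter applied early in the Query Editor can fold back to the source, so only the matching rows are ever transferred. A hedged M sketch with placeholder server, database, table, and column names:

```m
let
    // "MyServer", "SalesDB", "Orders", and "OrderDate" are placeholders.
    Source = Sql.Database("MyServer", "SalesDB"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // If this step folds, the filter runs as a WHERE clause on the server
    // and only the subset is imported.
    Recent = Table.SelectRows(Orders, each [OrderDate] >= #date(2016, 1, 1))
in
    Recent
```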

 

If you need to work with the entire data set, I would recommend DirectQuery (if the underlying source is compatible with DQ). There are some limitations with regard to DAX, but DQ does not import any data into the data model, so size is not an issue.

I'd love to know how the import filtering works and how to optimize the query to minimize the import. I have a table with 60M records, which I need to filter on two different criteria; that selects a few thousand rows which actually end up in the final imported table. However, whenever I refresh the query, Power BI imports about 16M records over ODBC from our CRM, which literally takes hours. It's obviously doing some filtering, because it doesn't grab the whole database, but it's pulling in far more rows than make it through the filters and into the table.
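When filter steps don't fold over ODBC (so Power BI pulls far more rows than the filters keep), one workaround is to send the filter as a native query so the source does the filtering before any rows cross the wire. A sketch only; the DSN, table, and column names are hypothetical:

```m
let
    // "CRM_DSN", "Activities", "Status", and "ActivityDate" are placeholders.
    // Odbc.Query sends the SQL text to the driver as-is, so the WHERE
    // clause is evaluated at the source rather than by Power Query.
    Source = Odbc.Query(
        "dsn=CRM_DSN",
        "SELECT * FROM Activities
         WHERE Status = 'Closed' AND ActivityDate >= '2016-01-01'"
    )
in
    Source
```

Whether this helps depends on the CRM's ODBC driver supporting the SQL dialect you send it.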
