I have a huge dataset containing aggregated data that has to be disaggregated so it can be presented on different types of charts. The issue is that I need to create a table for each question in order to unpivot the columns; if I unpivot the columns on the original table, it will distort the data structure. Is there a quick solution to this, or, if I create a table for each question, will this affect the performance of loading reports or publishing the data? One more concern is the number of tables I might end up creating, as we have about 500 columns. Is there a recommended way to handle this amount of data?
Thanks, -H
In this scenario, since you have too many columns per row in your dataset, I suggest you use an ETL tool like SSIS to unpivot the source table, then import the destination table into Power BI. See: How To Use the Unpivot Transform in SSIS.
Another option is to separate the dataset into multiple datasets, each containing around 50 columns. Then unpivot each dataset and use "Append Queries" in the Query Editor to combine the datasets together.
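To illustrate the split-unpivot-append idea, here is a minimal sketch in pandas (the column names and chunk size are hypothetical; in Power BI the same steps would be the Unpivot Columns and Append Queries transforms in the Query Editor):

```python
import pandas as pd

# Hypothetical wide survey table: one row per user, one column per question.
wide = pd.DataFrame({
    "UserId": [1, 2, 3],
    "Q1": ["A", "B", "A"],
    "Q2": ["Yes", "No", "Yes"],
    "Q3": ["X", "Y", "Z"],
})

question_cols = [c for c in wide.columns if c != "UserId"]
chunk_size = 2  # in practice e.g. 50 columns per chunk, as suggested above

# Unpivot each chunk of question columns, then append the results.
chunks = []
for i in range(0, len(question_cols), chunk_size):
    cols = question_cols[i:i + chunk_size]
    chunks.append(wide.melt(id_vars="UserId", value_vars=cols,
                            var_name="Question", value_name="Response"))

tall = pd.concat(chunks, ignore_index=True)
# tall has one row per (UserId, Question) pair: 3 users x 3 questions = 9 rows
print(len(tall))
```

The result is the same tall table you would get by unpivoting all 500 columns at once, just built in smaller pieces.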
Also please refer to: Data Import Best Practices in Power BI
Regards,
Thanks both for replying to my question, that is appreciated.
I wanted to ask a few things about the limitations of using large datasets in Power BI. Given that limit, I have a dataset with 500 columns: in your experience, would it be okay to separate these columns into different datasets, not in Power BI but in Excel, and then add those as sources for my dashboard? Yes, our data is a bit huge; we tried many BI tools and found Power BI good, as it serves our requirements. The only concern that might get in the way of carrying on with Power BI, and push us to look for another platform, is the dataset size limitation.
I understand you have survey data, presented in rows (UserId, Question, Response), and you want to be able to answer questions like "how many customers answered A to question 1 and answered B to question 2?"
Creating new columns is the way to go if you want to use an arbitrary number of filters.
Alternatively, Marco Russo and Alberto Ferrari have proposed a pattern that only allows the user to apply a predefined number of filters:
http://www.daxpatterns.com/survey/
This drawback comes with the advantage that no unpivoting is required, which means less memory is needed (due to fewer columns being present in the data model).
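To show why the tall (UserId, Question, Response) shape can still answer cross-question filters, here is a small pandas sketch (the data values are made up for illustration):

```python
import pandas as pd

# Survey responses already in the tall shape described above.
responses = pd.DataFrame({
    "UserId":   [1, 1, 2, 2, 3, 3],
    "Question": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
    "Response": ["A", "B", "A", "C", "B", "B"],
})

# Users who answered A to question 1.
q1_a = set(responses.loc[(responses["Question"] == "Q1")
                         & (responses["Response"] == "A"), "UserId"])

# Users who answered B to question 2.
q2_b = set(responses.loc[(responses["Question"] == "Q2")
                         & (responses["Response"] == "B"), "UserId"])

# "How many customers answered A to Q1 AND B to Q2?" -> intersect the user sets.
count = len(q1_a & q2_b)
print(count)
```

Each predefined filter in the DAX Patterns approach works on the same principle: filter the response rows per question, then intersect the resulting sets of users.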
See if it works for you.
Well, for me it worked perfectly. What I did is split the dataset into separate sheets, and it worked without any major slowness. I thought it wasn't going to work but apparently it did 🙂
In Power BI, there is a 1 GB limit per dataset. So if your dataset will exceed this size, please separate it into multiple datasets.
Regards,