I am trying to process a large volume of data (61.3 GB) in an Analysis Services table. The data has 300 million records and 21 columns. Whenever I try to process the data using SQL Server Analysis Services, I get the following error:
Failed to save modifications to the server. Error returned: 'The number of items in the list is too large. Buffered lists can support up to 2147483647 items and streamed lists can support up to 4503599627370496 items. The exception was raised by the IDbCommand interface.'
I need to process this volume of data. What is the recommended way to push this amount of data into an Analysis Services table? Please help.
Hi @rakesmanna,
Did you connect to SQL Server Analysis Services with DirectQuery mode or Import mode?
Regards,
Yuliana Gu
Thanks @v-yulgu-msft
Actually, I am importing data from Azure Data Lake Storage Gen1, and only Import mode is available there. To overcome the problem, I partitioned the table into smaller chunks so that no partition contains more than 10 million records. Then I created a smaller table with the same structure but only a handful of records, and imported this smaller table first. Finally, I used SQL Server Analysis Services to process all the partitions of the original table.
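For anyone following the same approach: once the partitions exist on the server, each one can be processed individually with a TMSL `refresh` command (run from SSMS via an XMLA query window, or through `Invoke-ASCmd`). The sketch below assumes hypothetical database, table, and partition names; substitute your own. Processing partitions one at a time keeps each refresh well under the buffered-list limit from the error above.

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyTabularDB",
        "table": "MyLargeTable",
        "partition": "MyLargeTable_Part01"
      }
    ]
  }
}
```

Repeating this command per partition (or listing several partitions in `objects`) lets the 300 million rows load in chunks instead of one oversized operation.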