Hey guys,
My source is Databricks, and two of its tables have more than 62 million rows each (62,312,943). Which storage mode would be preferable for me in this case?
The same source also has 10 smaller tables of 1-2M rows each.
Regards
@NimaiAhluwalia , I vote for the composite model. 😀
If that 62M-row table is very thin (not many columns), or you are on Premium, you can use Import mode for it.
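To illustrate the composite-model suggestion, here is a minimal TMSL-style sketch of how storage mode is set per table partition: the two large fact tables stay in DirectQuery while the small tables use Dual (or Import) so they can be cached. Table and partition names are hypothetical, not from the thread.

```json
{
  "tables": [
    {
      "name": "FactLarge1",
      "partitions": [
        { "name": "FactLarge1-Partition", "mode": "directQuery" }
      ]
    },
    {
      "name": "DimSmall1",
      "partitions": [
        { "name": "DimSmall1-Partition", "mode": "dual" }
      ]
    }
  ]
}
```

Dual mode lets the small dimension tables join efficiently with both the imported and DirectQuery sides of the model; in Power BI Desktop the same setting is exposed as the table's Storage mode property in Model view.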