This question is for those of you who use Power BI to handle a large number of tables.
I am working in a small but fast-growing company that currently uses Power BI to create reports for our customers. At the moment a typical report consists of a handful of tables, mostly small lookups plus a single data table with a few tens of thousands of rows. Already at about 40K rows I am having trouble executing DAX formulas on the dataset, mostly due to the rather large memory consumption of some functions, for instance EARLIER. But what about the near future, when the typical report contains ten or a hundred times as much data?
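As a side note on the EARLIER issue: the usual mitigation is to replace EARLIER with DAX variables, which capture the outer row context once instead of forcing the engine to keep both row contexts alive during the scan. A sketch of the pattern, assuming a hypothetical Sales table with Customer, Date, and Amount columns (a running total as a calculated column):

```dax
-- EARLIER version: tends to be slow and memory-hungry on large tables
Running Total =
SUMX (
    FILTER (
        Sales,
        Sales[Customer] = EARLIER ( Sales[Customer] )
            && Sales[Date] <= EARLIER ( Sales[Date] )
    ),
    Sales[Amount]
)

-- Equivalent using variables: same result, usually lighter
Running Total =
VAR CurrentCustomer = Sales[Customer]
VAR CurrentDate = Sales[Date]
RETURN
    SUMX (
        FILTER (
            Sales,
            Sales[Customer] = CurrentCustomer
                && Sales[Date] <= CurrentDate
        ),
        Sales[Amount]
    )
```

This helps with the formula-level memory pressure, but of course not with the broader scaling question below.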
What challenges or limitations do you face, and how do you handle them?
Is it technically and practically possible to use Power BI with hundreds or thousands of tables?
To clarify: if the answer is not a clear, unambiguous "yes", then we will probably have to look for an alternative solution.