Bit of a can-of-worms question, so first you'll want to step back and consider the whole data life cycle to make sure it's fit for purpose.
So you say you have a database with 300 tables, only 100 of which need to be used in reporting. Is this database a production system with standard read/write operations occurring constantly?
Have you considered a data warehouse in the first instance, then making use of Views and Stored Procedures to provide a tighter querying process? Sometimes you only want a simple model, and going through Analysis Services can be overkill.
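To make the Views idea a bit more concrete, here's a minimal sketch in Python with pyodbc, purely illustrative: it creates one reporting view so the report layer only ever touches that view rather than the raw tables. The server, schema, table, and column names are all made up, not from your database.

```python
# Minimal sketch (hypothetical names throughout): expose only the columns
# reporting actually needs through a view, instead of pointing the report
# layer at raw production tables.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # placeholder server
    "DATABASE=SalesDW;"                       # placeholder database
    "UID=report_admin;PWD=<secret>;"          # placeholder credentials
)

CREATE_VIEW_SQL = """
CREATE OR ALTER VIEW rpt.vw_SalesSummary AS
SELECT s.SaleDate,
       s.Region,
       s.ProductId,
       SUM(s.NetAmount) AS NetAmount
FROM   dbo.Sales AS s
GROUP  BY s.SaleDate, s.Region, s.ProductId;
"""

with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    # Reporting tools (Power BI, AAS) then import rpt.vw_SalesSummary only.
    conn.execute(CREATE_VIEW_SQL)
```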
Analysis Services is great; I've been using it. Partitioning is worth considering, especially if you have that much data. We have a data warehouse set up in Azure, and we use Data Factory to call a Logic App which processes our Azure AS models. I then surface those models in 7 different workspaces holding a variety of regionalised reports, each localised for the language and currency of the managers in that region.
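For reference, the kind of processing step our Logic App performs can also be done directly against the Azure Analysis Services REST API. This is a rough sketch only, assuming a service principal with admin rights on the AAS server; the tenant, server, model, and region values are placeholders, not our actual environment:

```python
# Rough sketch: trigger a full refresh of an Azure Analysis Services model
# via its REST API. Assumes a service principal that is an admin on the
# server; all IDs, regions, and names below are placeholders.
import requests
from azure.identity import ClientSecretCredential

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-registration-id>"
CLIENT_SECRET = "<client-secret>"

REGION = "westeurope"      # region of the AAS server
SERVER = "myaasserver"     # asazure://westeurope.asazure.windows.net/myaasserver
MODEL = "SalesModel"       # the tabular model to process

# Token scoped to Azure Analysis Services
credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = credential.get_token("https://*.asazure.windows.net/.default").token

# Kick off an asynchronous full refresh of the model
resp = requests.post(
    f"https://{REGION}.asazure.windows.net/servers/{SERVER}/models/{MODEL}/refreshes",
    headers={"Authorization": f"Bearer {token}"},
    json={"Type": "Full", "CommitMode": "transactional", "MaxParallelism": 2},
    timeout=30,
)
resp.raise_for_status()
# The Location header points at the refresh operation; poll it for status.
print("Refresh accepted:", resp.headers.get("Location"))
```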
The database is for read-only purposes (in terms of our scope).
For now, we only need a sales reporting service, so a data warehouse seems a bit excessive in terms of cost and infrastructure.
We've been looking into Data Factory to get all the data into an Azure DB, and then importing it into Power BI or AAS.
Could I ask how you managed to work around the connection between Data Factory and on-premises data? There's no way to get a static IP from ADF in order to grant it access.