I need some suggestions on how to implement a large dataset for use in both Excel and Power BI.
I currently have a report with the dataset in Power BI, which is great, but when you try to Analyze in Excel, you run into the limitation that you can't drag just any column into the Values area of a pivot table to get implicit measures.
Most of my users do their work in Excel, so I thought of creating the data model in Power BI and accessing it from Excel, but I hit the same issue: I can't drag any column into the Values area. I searched and found that this is by design, so I needed to try something else.
So, I created the model in Excel using Power Query. This way I can upload the Excel workbook into Power BI and still use the data model created in the workbook as a data source. I'm hoping I can also access it in Power BI Desktop as long as it's uploaded to the Power BI service.
But here's my dilemma:
1. The data needs to be refreshed. I'm assuming it still can be if it's in Power BI, but if the workbook lives on OneDrive, can it still be refreshed automatically?
2. RLS. I need this, and I have it working in Power BI, but how do I do the same thing when the model is in Excel?
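For reference, the RLS I have working in Power BI is just a standard DAX role filter. A simplified sketch (table and column names here are placeholders, not my actual model; `UserRegion` is an assumed mapping table of user emails to regions):

```dax
// Role filter applied to the Sales table in a Power BI security role.
// USERPRINCIPALNAME() returns the signed-in user's UPN (email address),
// which is looked up in the assumed UserRegion mapping table.
[Region]
    = LOOKUPVALUE (
        UserRegion[Region],
        UserRegion[UserEmail], USERPRINCIPALNAME ()
    )
```

My question is how to get equivalent per-user filtering when the model lives in an Excel workbook, since the Excel data model has no concept of security roles as far as I can tell.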
I'm confused about where the master data source should be created so that I don't end up with multiple copies of the same dataset and then have to worry about maintaining updates to the query in multiple places.