I'm trying to scale a PBI solution for student data. We have LocationCodes that identify both in-residence and online campuses; for my current reporting I'm only interested in the in-res LocationCodes, but I plan to load the online campus information just in case.
I have six or seven queries that pull a LocationCode from a SQL Server database, and currently they all use a long WHERE clause in the source step listing each LocationCode I want. It would be cumbersome to update all of these queries any time I want to adjust which locations to pull. Likewise, I don't want to use text filters or any other kind of applied step in every query that would require updating them one at a time. What I'd like is a single function/measure/calculated table (not sure what it would be considered at this point) identifying all of the LocationCodes I'm interested in, which I can plug into the queries so that the tables they output include only the records with the LocationCodes I've identified in that one place.
Does that make sense? I can provide some mockups if needed.
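As a quick mockup, the current pattern repeated in each query's source step looks roughly like this (server, database, table, and codes are all placeholders, not my real values):

```
// One of the six or seven queries — the WHERE clause is duplicated in each
let
    Source = Sql.Database("myserver", "StudentDB"),
    Enrollments = Value.NativeQuery(
        Source,
        "SELECT * FROM dbo.Enrollments
         WHERE LocationCode IN ('RES01', 'RES02', 'RES03')"
    )
in
    Enrollments
```

Changing the set of in-res codes currently means editing that IN list in every query.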
Let me make sure I understand your requirement: you want to import data tables containing LocationCodes from the web and from a SQL Server database, load them into Power BI once, and then shape (filter) the data in those tables using features such as calculated tables in Power BI, right?
First, measures and calculated tables are DAX objects, and DAX is not applied while the data is being loaded. Instead, you can create another table with one column that contains the full list of LocationCodes, then create relationships between that table and your data tables. Finally, you can build a map visual on top of the created table.
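If you would rather apply the filter during load (in Power Query rather than DAX), a common pattern is to keep the codes in one small list query and reference it from every other query. A sketch, assuming a query named InResLocations and placeholder codes:

```
// Query "InResLocations" — the single place the codes are maintained (placeholder values)
let
    Source = {"RES01", "RES02", "RES03"}
in
    Source
```

Then each data query filters on that shared list instead of a hard-coded WHERE clause:

```
// In each data query — filter rows whose LocationCode appears in the shared list
let
    Source = Sql.Database("myserver", "StudentDB"),
    Enrollments = Source{[Schema = "dbo", Item = "Enrollments"]}[Data],
    Filtered = Table.SelectRows(
        Enrollments,
        each List.Contains(InResLocations, [LocationCode])
    )
in
    Filtered
```

With this setup, adjusting the locations means editing only InResLocations. In my experience a List.Contains filter against a small literal list usually folds back to a SQL IN clause, so performance should be comparable, but you can confirm folding in your own environment.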