Hello all!
I'm newer to developing Power BI data connectivity.
We have a SQL database that holds sales data at a granular level (POS).
Our Power BI service (Premium Per User, app built) has a table that references this data source and table; there is no "advanced" SQL query, just credentials, with the data manipulation done afterward in Power BI.
To optimize our POS database, we now have auto-archiving enabled, with a new database (same SQL instance) containing all the same tables with the older data.
My goal is to open the table, add a "second source," and use a WHERE clause in the SQL query to filter by date: everything before 1/1/22 goes to the archive, and everything after goes to the live database. I would then append the two and have the current Power BI data manipulations apply to the appended data set.
I am having a hard time finding how to "add" this second source to the same table. Am I thinking about it incorrectly, or am I not understanding/seeing the Power BI functions needed to accomplish this?
Any help would be greatly appreciated.
There are many ways to do this, here's how I would personally approach it:
1. Create two SQL views over your SQL source, one for the archived rows ("SQL-A") and one for the fresh rows ("SQL-B"). As you said, these views are the same query with a different WHERE clause.
2. Create two Power BI dataflows ("DF-A" and "DF-B") ingesting these two views separately. Then schedule a refresh only on the one ingesting the fresh view (DF-B).
3. Create a third dataflow ("DF-C") that simply appends the two previous ones to each other as Linked Entities. The scheduled refresh on DF-B should automatically also refresh DF-C.
4. Substitute DF-C as the source used in your Power BI dataset as a replacement of the current SQL query.
5. Optional - orchestrate dataflow and dataset refreshes via Power Automate.
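As a rough sketch of step 1, the two views could look like the following. Note that the database, table, and column names here (ArchiveDb, LiveDb, dbo.Sales, SaleDate) are placeholders, so adjust them to your actual schema and archive setup:

```sql
-- SQL-A: archived rows (everything before the 1/1/22 cutoff)
-- Assumes the archive database sits on the same SQL instance.
CREATE VIEW dbo.vw_Sales_Archive AS
SELECT *
FROM ArchiveDb.dbo.Sales
WHERE SaleDate < '2022-01-01';

-- SQL-B: fresh rows (everything on or after the cutoff)
CREATE VIEW dbo.vw_Sales_Live AS
SELECT *
FROM LiveDb.dbo.Sales
WHERE SaleDate >= '2022-01-01';
```

Keeping the cutoff date in the views (rather than in Power BI) means the boundary between archive and live data lives in one place on the SQL side, and the dataflows in steps 2–3 can ingest each view as-is.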
Thank you so much. This worked and was easier than anticipated.