I have a fairly simple report I am using to show parts in our inventory that need to be repaired but haven't been returned. I keep a separate Excel spreadsheet of notes on the inventory so that everyone will know where I am with tracking down the missing cores. It's very simple: just the material number, a notes field, and a follow-up date. The purpose of the follow-up date is that I can hide an item from my list, and from my KPI card, if today is before its follow-up date. I have a column in the table, as follows, that I use to filter out follow-up dates after today.
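A calculated column along these lines can do that (table and column names here are placeholders, not the actual ones from my model):

```
// Sketch of a visibility flag column; show the row once the
// follow-up date has arrived (or when no date is set).
Show In Report =
IF (
    ISBLANK ( 'Inventory Notes'[Follow Up Date] )
        || 'Inventory Notes'[Follow Up Date] <= TODAY (),
    1,
    0
)
```

Filtering the visual (and the KPI card) to `Show In Report = 1` then hides anything whose follow-up date is still in the future.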
I would think, or at least assume, that the report pulls from the same data no matter where you're viewing it from. Though it may not be exactly the same scenario, it certainly raises the point that the two views may not be pulling from the same stored data. Even after all this time, thank you for the insight into your situation; I may revisit this KPI to see if I can get it to work.
It's been a while since you asked your question, but I ran across this post while trying to solve my own problem, so I thought I'd mention it. I had a similar issue with a Card visualization, which turned out to be an issue for the entire page. The page revolved around average time to closure of helpdesk tickets. When designing and viewing the data in Power BI Desktop, there were no issues. Each ticket has a column for the time it was closed, so one of my main filters was to ignore any tickets where that value was blank (i.e., the ticket wasn't closed yet).
This dashboard was already in Azure and linked to a promoted semantic model that was refreshing on a regular schedule. What I observed was that when I published the report and then immediately refreshed the dashboard in Azure, the numbers would look correct for a few seconds, but then slowly started reverting to extremely large (small? 🙂) negative numbers. What I eventually found was that while I was connected to my on-prem data source, those blank values were fine, but when the semantic model stored that data in Azure, it was converting the blank values to 12/30/1899 12:00:00 AM. So suddenly I was pulling in all these tickets with a -124-year closure time, which threw my numbers completely off. I don't know how to stop those dates from being filled in, but I do know how to filter out that date, which essentially fixed my issue.
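For anyone hitting the same thing: 12/30/1899 is the zero point of the OLE Automation date format that date/time values are stored in, which is why blanks surface as that exact timestamp. A measure along these lines (table and column names are made up for illustration) filters the sentinel out before averaging:

```
// Sketch: ignore tickets whose Closed Date is blank or the
// OLE "zero date" (12/30/1899) before computing closure time.
Avg Days To Close =
AVERAGEX (
    FILTER (
        'Tickets',
        NOT ISBLANK ( 'Tickets'[Closed Date] )
            && 'Tickets'[Closed Date] > DATE ( 1899, 12, 30 )
    ),
    DATEDIFF ( 'Tickets'[Opened Date], 'Tickets'[Closed Date], DAY )
)
```

The same `DATE ( 1899, 12, 30 )` comparison also works as a plain visual- or page-level filter condition if you'd rather not touch the measure.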
This obviously isn't the same as what the OP posted, but I wanted to throw it in here in case someone else comes across this in the future. Check your semantic model in Azure and make sure it isn't transforming your data somehow.