Hi
This is related to https://community.powerbi.com/t5/Desktop/Filter-table-based-on-less-than-and-greater-than-date-value...
This has been implemented but I'm experiencing performance issues when dragging a field into my tablix visualisation.
My data model:
The table 'Claim Amounts' has approx. 1.5m rows
I have a measure that sets a value based on the value selected in the Binder Dates[Date] slicer:
The problem I'm experiencing is that when I try to include Policy Reference in the below visual and apply a filter, the query just times out. I suspect it's because I'm applying a filter across 1.5m rows.
Is there another solution to this? This works just fine except for the performance issue
I'd consider expanding the table with the dates so you have a row per day for each item:
Have a look at:
It takes a basic table with From and To dates and then in power query:
1) Adds a Number of Days custom column:
Number.From( [Date To] - [Date From] ) + 1
2) Sets its data type to whole number.
3) Adds a custom column containing a list of dates:
List.Dates([Date From], [Number of Days], #duration(1, 0, 0, 0))
4) Expands the list to new rows.
5) Removes the other columns.
6) Sets the date column's data type to date.
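Those steps can be combined into a single M query. This is only a sketch: the Source step is a placeholder for your existing query, and I'm assuming a UMR key column on the table, so adjust names to your model:

```
let
    // Placeholder: your existing Binding Authority query with [Date From] and [Date To]
    Source = BindingAuthority,
    // 1) + 2) Number of days in the range, typed as whole number
    AddDays = Table.AddColumn(
        Source, "Number of Days",
        each Number.From([Date To] - [Date From]) + 1,
        Int64.Type
    ),
    // 3) List of dates covering the range
    AddDateList = Table.AddColumn(
        AddDays, "Date",
        each List.Dates([Date From], [Number of Days], #duration(1, 0, 0, 0))
    ),
    // 4) One row per date
    ExpandToRows = Table.ExpandListColumn(AddDateList, "Date"),
    // 5) Keep only the columns the model needs (assumed names)
    RemoveCols = Table.SelectColumns(ExpandToRows, {"UMR", "Date"}),
    // 6) Type the new column as date
    SetType = Table.TransformColumnTypes(RemoveCols, {{"Date", type date}})
in
    SetType
```

With the table at one row per day, the slicer date matches rows on equality rather than a between-dates condition, which is what makes the filter cheap.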
I think you can use this to massively increase the speed of the filter on that table.
If you're then still having performance issues, the only other thing I can think of is to move the filter directly into DAX by getting VALUES ( 'Binding Authority'[UMR] ) and using TREATAS within CALCULATE to apply it directly to your big table.
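As a sketch of that TREATAS approach (the base measure and the 'Claim Amounts'[UMR] column are assumptions, since the claims table's columns aren't shown in the thread):

```
Claims In Scope :=
CALCULATE (
    [Total Claim Amount],  -- illustrative base measure
    TREATAS (
        -- UMRs visible under the current slicer selection...
        VALUES ( 'Binding Authority'[UMR] ),
        -- ...applied as a filter directly on the big table
        'Claim Amounts'[UMR]
    )
)
```

This bypasses the relationship path through the bridge table, so the engine filters the 1.5m-row table in one step.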
@brian0782 Can you rewrite this IF statement by storing the calculations in variables? Using those variables might improve the performance.
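The original IF measure isn't shown in the thread, so this only sketches the shape of the refactor with illustrative names: each calculation is evaluated once into a VAR instead of being repeated inside the IF branches.

```
My Measure :=
VAR SelectedDate =
    SELECTEDVALUE ( 'Binder Dates'[Date] )
VAR ClaimTotal =
    SUM ( 'Claim Amounts'[Amount] )  -- illustrative calculation
RETURN
    -- Each VAR is computed at most once, rather than per branch
    IF ( NOT ISBLANK ( SelectedDate ), ClaimTotal, BLANK () )
```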
Hi, I tried this but it didn't make much difference to performance:
You could try removing the bridge table and replacing it with a direct many-to-many relationship, with Binding Authority set to filter Claim Amounts.
My understanding is that the engine should be better optimised for that.
Failing that, you need to change the grain of the Binding Authority table so each row represents a day, which makes the filter much simpler. I can share some code later. It sounds odd that adding more rows will speed it up, but it massively will.
Hi, I did have this as a many-to-many but had to change it to get a calculation working. I think when I had the direct many-to-many I was still experiencing performance issues.
So I have expanded out the dates and this has massively helped! Performance is hugely improved and the query is no longer timing out.
I would never have thought adding extra rows would improve this.
The only drawback I would say is that this only works with single select. I'm trying to think of a situation where a user might want to select multiple dates, but at the moment this is working.
Many thanks