I have data with a customer key, a date, and has_clicked / has_opened flags, and I want to use a subquery to aggregate the data with dynamic date filters linked to it.
I can write the SUMMARIZE with fixed dates; I just don't know how to make it dynamic and link it to a slicer so that the elements in bold change as the slicer does:
SUMMARIZE (
    FILTER (
        ALL ( 'SK_powerbi_summarize_test' ),
        'SK_powerbi_summarize_test'[send_datetime] >= DATE ( 2017, 1, 1 )
            && 'SK_powerbi_summarize_test'[send_datetime] <= DATE ( 2018, 1, 1 )
    ),
    'SK_powerbi_summarize_test'[pi_entity_fk],
    "has_opened", SUM ( 'SK_powerbi_summarize_test'[has_opened] ),
    "has_clicked", SUM ( 'SK_powerbi_summarize_test'[has_clicked] )
)
Thank you. After quite a bit of reading and research I had started to conclude that what I was trying to do isn't possible in the way I was trying to do it. Your post takes me from a suspicion that I couldn't do it to being sure I needed another approach, and it confirms the reason why (only measures are dynamic and linked to slicers), so it's a great help in stopping me travelling down a dead-end road.
An alternative approach I'm going to try is to use the summarize functionality to create a table that is just a unique list of all entity keys (customer identifiers); this will regenerate and be updated as new data is loaded. Then I use measures to calculate the totals I need, and these measures will be the dynamic element that responds to slicers changing. All I need then is a filter on the visuals to exclude customer keys where the measure is 0.
This feels like an approach that could work. Do you agree?
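A minimal sketch of that idea, reusing the table and column names from the original post (the calculated-table name entity_keys is illustrative):

```dax
-- Calculated table: one row per unique customer key
entity_keys =
DISTINCT ( 'SK_powerbi_summarize_test'[pi_entity_fk] )

-- Measure: re-evaluates under whatever date slicer filters the fact table
dynamic_has_opened =
SUM ( 'SK_powerbi_summarize_test'[has_opened] )
```

With a relationship from entity_keys to the fact table, a visual-level filter of "dynamic_has_opened is not 0" would hide the inactive customer keys.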
OK here is my solution - hold on to your hats it's tricky
1) Create a summary table of the customer key (pi_entity_fk)
entity_summary = GROUPBY('SK_powerbi_summarize_test', 'SK_powerbi_summarize_test'[pi_entity_fk],
"Count_Records", COUNTX( CURRENTGROUP (),'SK_powerbi_summarize_test'[has_opened]))
2) Create a relationship between the entity_summary table and the original table it was created from (SK_powerbi_summarize_test), joining on the customer key pi_entity_fk
3) Create 2 measures: one to count the values and another to band them for use in charts. I expected it to stop here, and then to use these measures in a chart, but you can't do that, so we have a couple more steps
dynamic_has_opened = CALCULATE ( SUM ( SK_powerbi_summarize_test[has_opened] ) )
has_opend_band2 =
SWITCH (
    TRUE (),
    [dynamic_has_opened] > 50, "Over 50",
    [dynamic_has_opened] > 20, "21-50",
    [dynamic_has_opened] > 10, "11-20",
    [dynamic_has_opened] > 0, "1-10",
    "None"
)
4) Create a table (open_band_summary) containing the band descriptions used in the measure
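One way to build that band table is with DATATABLE, a sketch assuming the open_band_summary table and open_band column referenced in the next step, with the band labels taken from the has_opend_band2 measure:

```dax
open_band_summary =
DATATABLE (
    "open_band", STRING,
    { { "Over 50" }, { "21-50" }, { "11-20" }, { "1-10" }, { "None" } }
)
```

An entered table created through the Power BI UI would work equally well; the point is that the band labels must match the strings returned by the banding measure exactly.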
5) In this new summary table, create a measure to count the occurrences of each banded value in the entity_summary table
VAR MeasureValue = SELECTEDVALUE ( 'open_band_summary'[open_band] )
RETURN
    COUNTX (
        FILTER ( 'entity_summary', [has_opend_band2] = MeasureValue ),
        [has_opend_band2]
    )
Now you can chart the summarised data, and a date slicer linked to the original data table's (SK_powerbi_summarize_test) date field will update the chart
Glad that you solved your problem. You could accept the replies that make sense as the solution to your question, so that people who have the same question can find the solution directly.
While it solves the problem, the performance of measures on tables with 3 million records is SLOW in the extreme. SQL queries to do the summarisation take less than a second, but Power BI takes almost a minute to calculate it every time I move the slicer, and it often runs out of memory ... a VBA-driven chart with stored procedures and a pass-through query would be almost 100 times faster.
Although I've just seen an R ODBC clip on YouTube which I think does this without using any of these Power BI DAX data-wrangling features ...