I'm trying to evaluate each column value against the measure, but I can't figure out why it is struggling.
A very simple one-column table (Test):
Measure = STDEV.P(Test[Number]), which gives a standard deviation of 2.87.
I now have a calculated column that checks whether each value in the Number column is greater than or less than the standard deviation calculated in the measure, using this formula:
LogicalTest = IF(Test[Number]<[Measure], "Less", "Greater")
But the resulting table looks like this:
Why is it evaluating every row as "Greater"? Surely values 1 and 2 should show "Less".
See the PBI file below:
Measures transform row context into filter context (context transition). So in your formula above, STDEV.P gets calculated for each individual row, and the standard deviation of a single value is 0, which is why every row compares as "Greater". Use the aggregation directly instead:
LogicalTest = IF(Test[Number]<STDEV.P(Test[Number]), "Less", "Greater")
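To see why the original column misbehaves: referencing a measure from a calculated column implicitly wraps it in CALCULATE, and CALCULATE turns the current row into a filter. A rough sketch of what the engine effectively evaluates (illustrative, not the literal expansion):

```dax
-- What was written in the calculated column:
LogicalTest = IF ( Test[Number] < [Measure], "Less", "Greater" )

-- What context transition makes it behave like:
LogicalTest =
IF (
    Test[Number] < CALCULATE ( STDEV.P ( Test[Number] ) ),  -- filtered to the current row
    "Less",
    "Greater"
)
-- STDEV.P over a single value returns 0, so every row yields "Greater".
```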
Thanks for that.
The only issue I have is that the standard deviation needs to be stored as a measure, and each row then needs to be evaluated against that measure value. Can this be achieved somehow?
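If the value must stay in a measure, one common pattern is to keep the measure reference but undo the filter introduced by context transition with ALL (or REMOVEFILTERS in newer versions). A sketch, assuming the table and measure names from above:

```dax
LogicalTest =
IF (
    -- ALL ( Test ) removes the single-row filter created by context
    -- transition, so [Measure] is evaluated over the whole table
    Test[Number] < CALCULATE ( [Measure], ALL ( Test ) ),
    "Less",
    "Greater"
)
```

This way the standard deviation logic lives only in the measure, and the calculated column compares each row against the table-wide value (2.87 in this example).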