merdogmus
Frequent Visitor

Calculating consumption over time from cumulative data

Hello,

 

I have a system that logs counter values from different sources, which gives me a table like the one below; I can access it on demand via a SQL database. The same database also holds other unique "DescriptionTag"s. The system works well for some sources, but for others it fails to collect data every 15 minutes, or even misses a quarter of a day. So the daily consumption should be calculated as:

 

IF [THERE IS NO FIRST VALUE OF 02.03.2018] THEN
    [CONSUMPTION OF 01.03.2018] = [LAST LOG OF 01.03.2018] - [FIRST LOG OF 01.03.2018]
ELSE
    [CONSUMPTION OF 01.03.2018] = [FIRST LOG OF 02.03.2018] - [FIRST LOG OF 01.03.2018]
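As a rough DAX sketch of that rule (assuming the log table is called Tuketim, as later in this thread, and that a date-only column Tuketim[Date] has been derived from the timestamp; the measure name and the Date column are illustrative, not from the original post):

Daily Consumption =
VAR FirstOfDay = MIN ( Tuketim[HistoryValue] )  -- counter only increases, so MIN = first log of the day
VAR LastOfDay = MAX ( Tuketim[HistoryValue] )
VAR NextDay = MAX ( Tuketim[Date] ) + 1
VAR FirstOfNextDay =
    CALCULATE ( MIN ( Tuketim[HistoryValue] ), Tuketim[Date] = NextDay )
RETURN
    IF (
        ISBLANK ( FirstOfNextDay ),
        LastOfDay - FirstOfDay,      -- no log on the next day: fall back to the last log of the day
        FirstOfNextDay - FirstOfDay  -- normal case: first log of the next day minus first log of the day
    )

This assumes the visual's filter context is one day (and one DescriptionTag) per data point.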

 

I want to show these tags on a chart by date (yearly, monthly, even hourly) using a measure, but I cannot find a way.

 

(screenshot: Capture.PNG)

 

Dummy data is below. If I could, I would attach an Excel file.

 

DescriptionTag | HistoryDate | HistoryValue
AE00000001 | 01.03.2018 00:00 | 84496431,61
AE00000001 | 01.03.2018 00:14 | 84496573,42
AE00000001 | 01.03.2018 00:30 | 84496706,8
AE00000001 | 01.03.2018 00:45 | 84496840,71
AE00000001 | 01.03.2018 01:00 | 84496976,87
AE00000001 | 01.03.2018 01:15 | 84497110,15
AE00000001 | 01.03.2018 01:30 | 84497237,87
AE00000001 | 01.03.2018 01:45 | 84497353,02
AE00000001 | 01.03.2018 02:00 | 84497467,28
AE00000001 | 01.03.2018 02:15 | 84497579,52
AE00000001 | 01.03.2018 02:30 | 84497695,24
AE00000001 | 01.03.2018 02:45 | 84497810,79
AE00000001 | 01.03.2018 03:00 | 84497927,9
AE00000001 | 01.03.2018 03:15 | 84498044,22
AE00000001 | 01.03.2018 03:30 | 84498160,88
AE00000001 | 01.03.2018 03:45 | 84498278,69
AE00000001 | 01.03.2018 04:00 | 84498397,85
AE00000001 | 01.03.2018 04:14 | 84498511,47
AE00000001 | 01.03.2018 04:30 | 84498624,51
AE00000001 | 01.03.2018 04:45 | 84498737,08
AE00000001 | 01.03.2018 05:00 | 84498849,43
AE00000001 | 01.03.2018 05:15 | 84498960,19
AE00000001 | 01.03.2018 05:30 | 84499069,18
AE00000001 | 01.03.2018 05:45 | 84499177,49
AE00000001 | 01.03.2018 06:00 | 84499287,02
AE00000001 | 01.03.2018 06:15 | 84499396,85
AE00000001 | 01.03.2018 06:30 | 84499507,44
AE00000001 | 01.03.2018 06:45 | 84499625,97
AE00000001 | 01.03.2018 07:00 | 84499751,83
AE00000001 | 01.03.2018 07:15 | 84499882,65
AE00000001 | 01.03.2018 07:30 | 84500071,88
AE00000001 | 01.03.2018 07:45 | 84500408,89
AE00000001 | 01.03.2018 08:00 | 84500852,77
AE00000001 | 01.03.2018 08:15 | 84501426,12
AE00000001 | 01.03.2018 08:30 | 84502060,62
AE00000001 | 01.03.2018 08:45 | 84502730,11
AE00000001 | 01.03.2018 09:00 | 84503384,53
AE00000001 | 01.03.2018 09:15 | 84504090,11
AE00000001 | 01.03.2018 09:30 | 84504835,35
AE00000001 | 01.03.2018 09:44 | 84505549,97
AE00000001 | 01.03.2018 10:00 | 84506155,54
AE00000001 | 01.03.2018 10:15 | 84506838,06
AE00000001 | 01.03.2018 10:30 | 84507556,32
AE00000001 | 01.03.2018 10:45 | 84508282,47
AE00000001 | 01.03.2018 11:00 | 84509025,28
AE00000001 | 01.03.2018 11:15 | 84509743,14
AE00000001 | 01.03.2018 11:30 | 84510449,5
AE00000001 | 01.03.2018 11:45 | 84511051
AE00000001 | 01.03.2018 12:00 | 84511600,09
AE00000001 | 01.03.2018 12:15 | 84512240,9
AE00000001 | 01.03.2018 12:30 | 84512915,19
AE00000001 | 01.03.2018 12:45 | 84513620
AE00000001 | 01.03.2018 13:00 | 84514322,16
AE00000001 | 01.03.2018 13:15 | 84515078,93
AE00000001 | 01.03.2018 13:30 | 84515842,41
AE00000001 | 01.03.2018 13:45 | 84516581,25
AE00000001 | 01.03.2018 14:00 | 84517201,65
AE00000001 | 01.03.2018 14:15 | 84517775,6
AE00000001 | 01.03.2018 14:30 | 84518509,48
AE00000001 | 01.03.2018 14:45 | 84519230,16
AE00000001 | 01.03.2018 15:00 | 84519961,39
AE00000001 | 01.03.2018 15:15 | 84520666,02
AE00000001 | 01.03.2018 15:30 | 84521285,93
AE00000001 | 01.03.2018 15:45 | 84521894,68
AE00000001 | 01.03.2018 16:00 | 84522369,86
AE00000001 | 01.03.2018 16:15 | 84522786,84
AE00000001 | 01.03.2018 16:30 | 84523207,4
AE00000001 | 01.03.2018 16:45 | 84523636,23
AE00000001 | 01.03.2018 17:00 | 84524067,39
AE00000001 | 01.03.2018 17:15 | 84524503,77
AE00000001 | 01.03.2018 17:30 | 84524940,86
AE00000001 | 01.03.2018 17:45 | 84525349,22
AE00000001 | 01.03.2018 17:59 | 84525704,31
AE00000001 | 01.03.2018 18:15 | 84526051,21
AE00000001 | 01.03.2018 18:30 | 84526464,66
AE00000001 | 01.03.2018 18:44 | 84526848,44
AE00000001 | 01.03.2018 19:00 | 84527219,36
AE00000001 | 01.03.2018 19:15 | 84527592,96
AE00000001 | 01.03.2018 19:30 | 84527948,76
AE00000001 | 01.03.2018 19:45 | 84528242,36
AE00000001 | 01.03.2018 19:59 | 84528528,49
AE00000001 | 01.03.2018 20:15 | 84528819,31
AE00000001 | 01.03.2018 20:30 | 84529133,17
AE00000001 | 01.03.2018 20:45 | 84529515,68
AE00000001 | 01.03.2018 21:00 | 84529902,55
AE00000001 | 01.03.2018 21:15 | 84530268,47
AE00000001 | 01.03.2018 21:29 | 84530618,6
AE00000001 | 01.03.2018 21:44 | 84530981,74
AE00000001 | 01.03.2018 22:00 | 84531308,74
AE00000001 | 01.03.2018 22:15 | 84531650,72
AE00000001 | 01.03.2018 22:30 | 84532020,39
AE00000001 | 01.03.2018 22:45 | 84532389,54
AE00000001 | 01.03.2018 23:00 | 84532737,28
AE00000001 | 01.03.2018 23:15 | 84533072,4
AE00000001 | 01.03.2018 23:30 | 84533370,33
AE00000001 | 01.03.2018 23:45 | 84533599,38
AE00000001 | 02.03.2018 00:00 | 84533820,45
AE00000001 | 02.03.2018 00:15 | 84534002,35
AE00000001 | 02.03.2018 00:29 | 84534185,24
AE00000001 | 02.03.2018 00:45 | 84534335,04
AE00000001 | 02.03.2018 01:00 | 84534433,97
AE00000001 | 02.03.2018 01:15 | 84534531,49
AE00000001 | 02.03.2018 01:30 | 84534629,5
AE00000001 | 02.03.2018 01:45 | 84534727,85
AE00000001 | 02.03.2018 02:00 | 84534826,5
AE00000001 | 02.03.2018 02:15 | 84534925,88
AE00000001 | 02.03.2018 02:30 | 84535026,34
AE00000001 | 02.03.2018 02:45 | 84535126,31
AE00000001 | 02.03.2018 03:00 | 84535224,43
AE00000001 | 02.03.2018 03:15 | 84535321,33
AE00000001 | 02.03.2018 03:30 | 84535419
AE00000001 | 02.03.2018 03:45 | 84535516,52
AE00000001 | 02.03.2018 04:00 | 84535613,33
AE00000001 | 02.03.2018 04:15 | 84535708,8

6 REPLIES
merdogmus
Frequent Visitor

OK. Here is the partial solution I have.

 

I made a measure as:

 

Tuketim = MAX(Tuketim[HistoryValue])-MIN(Tuketim[HistoryValue])

 

So I got the chart like this:

(screenshot: Capture.PNG)

 

But the problem is that when I do this I am calculating:

[HistoryValue at 23:45:00] - [HistoryValue at 00:00:00] on a daily or monthly basis. The consumption between 23:45 and 00:00 of the next day is lost.
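To make the gap concrete (worked from the dummy data above): for 01.03.2018, MAX - MIN gives 84533599,38 - 84496431,61 = 37167,77, whereas the intended daily consumption ([first log of 02.03.2018] - [first log of 01.03.2018]) is 84533820,45 - 84496431,61 = 37388,84. The missing 221,07 is the consumption between 23:45 and midnight.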

 

Also, I don't want to do this with MIN or MAX of HistoryValue. For the daily view I want to calculate: [first HistoryValue of the day] - [first HistoryValue of the next day]. For the monthly or yearly view, the calculation must change accordingly.

 

With live, fresh data I calculate ConsumptionOfToday = [HistoryValue of today, now] - [HistoryValue of today at 00:00], which is what I want to accomplish. For the monthly view, the formula also works well for the current month.




@merdogmus,

Create the measure in your table.

Measure = FIRSTNONBLANK(Tuketim[HistoryValue],1)

Create a new table using the DAX below.

Table 3 = SUMMARIZE(Tuketim,Tuketim[DescriptionTag],Tuketim[Date],"Historyvalue",[Measure])


Create the following columns in the new table and check whether you get the expected result.

INDEX = RANKX ( 'Table 3', 'Table 3'[Date], , ASC )

Column =
'Table 3'[Historyvalue]
    - CALCULATE (
        FIRSTNONBLANK ( 'Table 3'[Historyvalue], 1 ),
        FILTER (
            'Table 3',
            'Table 3'[DescriptionTag] = EARLIER ( 'Table 3'[DescriptionTag] )
                && 'Table 3'[INDEX] = EARLIER ( 'Table 3'[INDEX] ) + 1
        )
    )
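To sketch how these steps behave on the dummy data above (a worked illustration, not output from the actual model): SUMMARIZE leaves one row per tag per day holding that day's first HistoryValue, INDEX orders the days, and Column looks up the next day's row via INDEX + 1, roughly:

DescriptionTag | Date | Historyvalue | INDEX
AE00000001 | 01.03.2018 | 84496431,61 | 1
AE00000001 | 02.03.2018 | 84533820,45 | 2

so Column for 01.03.2018 would be 84496431,61 - 84533820,45 = -37388,84. Reverse the subtraction if a positive consumption figure is preferred.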



Regards,
Lydia

Community Support Team _ Lydia Zhang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


Thanks, it worked @v-yuezhe-msft

 

But the last day's data is wrong. Still, it is working; I can work on it from here.

(screenshot: solution.PNG)

v-yuezhe-msft
Microsoft

@merdogmus,

Could you please explain more about your expected result based on the dummy data?

Do you want to get the following result for a specific DescriptionTag when there is no HistoryValue at 02.03.2018 00:00?

[HistoryValue OF 01.03.2018 23:45]-[HistoryValue OF 01.03.2018 00:00]

If you have a record for 02.03.2018 00:00, do you want to get the following result?
[HistoryValue OF 02.03.2018 00:00]-[HistoryValue OF 01.03.2018 00:00]

In addition, what fields do you want to drag into the column chart other than the date field?

Regards,
Lydia


@v-yuezhe-msft

I want to make graphics like below.

 

If I can work out the logic of the formula, I will make more graphics like these. We currently make these charts manually in Excel.

(screenshot: Capture.PNG)

 

We have lots of counters like AE00000001 (more than 200).

 

The daily consumption is calculated, for example, as: [HistoryValue OF 02.03.2018 00:00] - [HistoryValue OF 01.03.2018 00:00]

If data is missing for some reason, the formula becomes: [HistoryValue OF first data of the next day] - [HistoryValue OF first data of the day]

@merdogmus,

The logic you posted is slightly different from the original logic.

Take 02.03.2018 and 01.03.2018 for example, if data is missing at 00:00 of 02.03.2018, you would like to get [HistoryValue OF first data of 02.03.2018]-[HistoryValue OF first data of 01.03.2018]. If data exists at 00:00 of 02.03.2018, you want to get [HistoryValue OF 02.03.2018 00:00]-[HistoryValue OF 01.03.2018 00:00]?

Regards,
Lydia

