yasirsd
Frequent Visitor

Repeated data should count only once per month and year

Dear all,

I need to count data only once per month and year. At the moment it is adding up all the values and delivering the wrong result. The Production Time column shows the same value for each event code description, which I don't want. Please help.

 

 

[Screenshot: PBi Question1.png]

1 ACCEPTED SOLUTION

Hi @yasirsd ,

 

Try a measure like this:

Measure =
CALCULATE (
    FIRSTNONBLANK ( 'Table'[Production Time], 1 ),
    ALLEXCEPT ( 'Table', 'Table'[year], 'Table'[month] )
)
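
Since the data also covers multiple machines, a variant that keeps one value per machine as well (a sketch, assuming a 'Table'[Machine Serial] column like the serial number mentioned later in the thread) could be:

Measure per machine =
CALCULATE (
    FIRSTNONBLANK ( 'Table'[Production Time], 1 ),
    ALLEXCEPT ( 'Table', 'Table'[year], 'Table'[month], 'Table'[Machine Serial] )
)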

 

Best Regards,
Liang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


10 REPLIES
amitchandak
Super User

@yasirsd , try something like this:

Measure =
SUMX (
    SUMMARIZE ( Table, Table[Month year], "_1", DISTINCTCOUNT ( Table[event code] ) ),
    [_1]
)
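
The same SUMMARIZE-then-SUMX pattern can also be pointed at the Production Time column itself, taking one value per month and summing across months (a sketch, assuming the Year, Month, and Production Time columns shown in the sample data below):

Production Time once per month =
SUMX (
    SUMMARIZE ( Table, Table[Year], Table[Month], "_pt", MAX ( Table[Production Time] ) ),
    [_pt]
)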

 

Refer to: https://community.powerbi.com/t5/Desktop/SUM-of-AVERAGE/td-p/197013

Tahreem24
Super User

@yasirsd ,

Please share a screenshot or some sample data for your issue, along with the DAX you used.

 

Don't forget to give a thumbs up and accept this as a solution if it helped you!


Dear,

 

The screenshot has been updated.

 

Thanks

@yasirsd what do you want the expected output to be? Also, can you share the data as a table instead of an image?




Year | Month | Event Code | Frequency | Hours | Production Time | Waste
2019 | January | 370340280 | 1 | 1 | 176 | 175
2019 | January | 103797020 | 30 | 28 | 176 | 5443
2019 | January | 370350610 | 2 | 0 | 176 | 185
2019 | January | 210300000 | 9 | 5 | 176 | 1699
2019 | January | 103315180 | 3 | 1 | 176 | 231
2019 | January | 310331000 | 3 | 1 | 176 | 263
2019 | January | 370340000 | 1 | 1 | 176 | 175
2019 | January | 370730000 | 1 | 0 | 176 | 101
2019 | January | 370340120 | 7 | 1 | 176 | 587
2019 | January | 210330730 | 4 | 1 | 176 | 388
2019 | January | 103805050 | 10 | 8 | 176 | 0
2019 | January | 103315300 | 2 | 0 | 176 | 133
2019 | January | 310332000 | 5 | 2 | 176 | 500
2019 | January | 201301700 | 3 | 2 | 176 | 446
2019 | January | 103317990 | 6 | 5 | 176 | 918
2019 | January | 701300000 | 4 | 2 | 176 | 377
2019 | January | 103797010 | 1 | 0 | 176 | 58
2019 | January | 103316040 | 5 | 1 | 176 | 295
2019 | January | 701301000 | 20 | 9 | 176 | 2383
2019 | January | 103805030 | 3 | 0 | 176 | 0
2019 | January | 201303000 | 1 | 1 | 176 | 74
2019 | January | 103317820 | 5 | 1 | 176 | 295
2019 | January | 701301040 | 1 | 0 | 176 | 39
2019 | January | 103805060 | 27 | 34 | 176 | 0
2019 | January | 103317000 | 5 | 1 | 176 | 351
2019 | January | 370350310 | 1 | 0 | 176 | 117
2019 | January | 370300000 | 32 | 11 | 176 | 3984
2019 | January | 310330000 | 1 | 0 | 176 | 59
2019 | January | 103317290 | 1 | 0 | 176 | 89
2019 | January | 103317410 | 2 | 0 | 176 | 118
2019 | January | 370330620 | 8 | 2 | 176 | 817
2019 | January | 370309020 | 1 | 0 | 176 | 95
2019 | January | 370340240 | 4 | 1 | 176 | 297
2019 | January | 370350130 | 7 | 1 | 176 | 547
2019 | January | 103712310 | 2 | 0 | 176 | 172
2019 | January | 103310940 | 1 | 0 | 176 | 59
2019 | January | 370350110 | 3 | 1 | 176 | 400
2019 | January | 310332030 | 2 | 1 | 176 | 332
2019 | January | 703301000 | 1 | 0 | 176 | 23
2019 | January | 370345230 | 1 | 0 | 176 | 88
2019 | January | 103317070 | 21 | 5 | 176 | 1588
2019 | January | 103316700 | 1 | 2 | 176 | 120
2019 | January | 370701130 | 9 | 3 | 176 | 756
2019 | January | 370340710 | 1 | 0 | 176 | 59
2019 | January | 210310000 | 4 | 2 | 176 | 536
2019 | January | 370750420 | 1 | 0 | 176 | 59
2019 | January | 103317810 | 20 | 5 | 176 | 1637
2019 | January | 103807000 | 51 | 24 | 176 | 7539
2019 | January | 210390000 | 5 | 2 | 176 | 438
2019 | January | 103399300 | 16 | 0 | 176 | 948
2019 | January | 310331030 | 1 | 0 | 176 | 100
2019 | January | 201706000 | 21 | 6 | 176 | 2607
2019 | January | 103313000 | 0 | 1 | 176 | 59
2019 | January | 103301000 | 1 | 0 | 176 | 59
2019 | January | 103316260 | 13 | 2 | 176 | 2089
2019 | January | 103797050 | 5 | 5 | 176 | 1507
2019 | January | 370360110 | 3 | 1 | 176 | 297
2019 | January | 370360210 | 2 | 0 | 176 | 144
2019 | January | 103700000 | 2 | 0 | 176 | 0
2019 | January | 370340220 | 6 | 2 | 176 | 752
2019 | January | 103399010 | 1 | 0 | 176 | 0
2019 | January | 310300000 | 3 | 2 | 176 | 370
2019 | January | 301792010 | 8 | 2 | 176 | 853
2019 | January | 701301010 | 11 | 4 | 176 | 1077
2019 | January | 801301010 | 1 | 3 | 176 | 660
2019 | January | 370340270 | 1 | 0 | 176 | 59
2019 | January | 103805010 | 21 | 3 | 176 | 0
2019 | January | 370381790 | 1 | 0 | 176 | 99
2019 | January | 103317420 | 3 | 1 | 176 | 238
2019 | February | 103315980 | 1 | 0 | 182 | 36
2019 | February | 103327130 | 1 | 4 | 182 | 481
2019 | February | 103797020 | 19 | 14 | 182 | 3311
2019 | February | 210300000 | 2 | 0 | 182 | 202
2019 | February | 103315180 | 3 | 0 | 182 | 206
2019 | February | 370330320 | 1 | 0 | 182 | 59
2019 | February | 370350140 | 4 | 2 | 182 | 319
2019 | February | 310331000 | 5 | 2 | 182 | 546
2019 | February | 201301000 | 1 | 0 | 182 | 59
2019 | February | 370340120 | 3 | 1 | 182 | 203
2019 | February | 210330730 | 19 | 5 | 182 | 1596
2019 | February | 103317140 | 1 | 1 | 182 | 164
2019 | February | 103805050 | 2 | 3 | 182 | 0
2019 | February | 370381510 | 2 | 0 | 182 | 118
2019 | February | 370330340 | 1 | 0 | 182 | 91
2019 | February | 103303110 | 2 | 2 | 182 | 256
2019 | February | 201305000 | 1 | 0 | 182 | 85
2019 | February | 201301700 | 5 | 0 | 182 | 315
2019 | February | 103317990 | 2 | 2 | 182 | 59
2019 | February | 370330430 | 1 | 1 | 182 | 189
2019 | February | 201300000 | 2 | 0 | 182 | 150
2019 | February | 103711120 | 1 | 0 | 182 | 47
2019 | February | 701300000 | 3 | 2 | 182 | 223
2019 | February | 103310020 | 4 | 1 | 182 | 275
2019 | February | 370350620 | 1 | 0 | 182 | 88
2019 | February | 370330330 | 4 | 8 | 182 | 673
2019 | February | 103312010 | 2 | 0 | 182 | 123
2019 | February | 103316040 | 8 | 1 | 182 | 710
2019 | February | 701301000 | 24 | 12 | 182 | 3163
2019 | February | 103805030 | 1 | 0 | 182 | 0
2019 | February | 103317820 | 11 | 5 | 182 | 913
2019 | February | 103317460 | 2 | 0 | 182 | 116
2019 | February | 103805060 | 30 | 28 | 182 | 0
2019 | February | 103317000 | 2 | 1 | 182 | 406
2019 | February | 301300000 | 1 | 0 | 182 | 29
2019 | February | 103312160 | 1 | 0 | 182 | 166
2019 | February | 370350310 | 3 | 1 | 182 | 294
2019 | February | 370300000 | 2 | 0 | 182 | 145
2019 | February | 310330000 | 3 | 1 | 182 | 177
2019 | February | 370330920 | 2 | 0 | 182 | 161
2019 | February | 103316000 | 1 | 0 | 182 | 100
2019 | February | 370350000 | 1 | 0 | 182 | 121
2019 | February | 103710250 | 1 | 0 | 182 | 27
2019 | February | 103316760 | 8 | 4 | 182 | 1099
2019 | February | 103317410 | 6 | 1 | 182 | 698

What I need is: for Jan 2019 it should count Production Time 176 only once. Currently it adds up all the Jan 2019 rows and gives a value in the thousands. I have the same data for multiple machines, and some machines have a Jan 2019 production time of zero, so if I use minimum it gives me a zero value.

I have a machine serial number to differentiate the data between machines.
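
One possible shape for that requirement (a sketch, not from the thread, assuming a 'Table'[Machine Serial] column alongside the Year and Month columns above) is to take one Production Time value per machine per month and then sum those:

Production Time per machine-month =
SUMX (
    SUMMARIZE ( 'Table', 'Table'[Year], 'Table'[Month], 'Table'[Machine Serial] ),
    CALCULATE ( MAX ( 'Table'[Production Time] ) )
)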


@yasirsd try this measure:

 

Measure =
SUMX (
    SUMMARIZE ( Table, Table[Date], Table[Machine], "Max Production", MAX ( Table[Production] ) ),
    [Max Production]
)
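
SUMMARIZE groups the table by Date and Machine and computes the maximum Production value within each group, so SUMX then adds each machine-date combination exactly once instead of summing every row.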

 


parry2k
Super User

@yasirsd change the aggregation to MIN or MAX instead of SUM. To change the aggregation, click the arrow next to the column you dropped in the Values section of the visual and choose the MIN/MAX aggregation.
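
The explicit-measure equivalent of that visual-level aggregation (a sketch, assuming the Production Time column from the sample data) would be:

Production Time (max) = MAX ( 'Table'[Production Time] )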

 

If that doesn't work, read this post to get your answer quickly.

https://community.powerbi.com/t5/Community-Blog/How-to-Get-Your-Question-Answered-Quickly/ba-p/38490

 


Hi,

 

MIN or MAX doesn't work, as I have some months with 0 values.
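
One way around the zero-valued months (a sketch, not suggested in the thread) is to exclude zeros before taking the minimum:

Min Production Time nonzero =
MINX (
    FILTER ( 'Table', 'Table'[Production Time] <> 0 ),
    'Table'[Production Time]
)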
