I'm hoping somebody can help. I have seen a lot of people trying to do something similar, but every solution I have tried doesn't seem to work for me.
What I am trying to achieve:
I have a table which contains capacity data for some VMware datastores. New data gets added roughly every 5 minutes.
For each datastore name I would like to add a calculated column which finds the "Used" amount closest to the date shown in the "7days" column. If I can do this without needing the "7days" column at all, that would be ideal.
Example data:
Name | VMCount | Total | Free | Used | Percent | DateTime | LatestDate | IsLatest | 7days |
---|---|---|---|---|---|---|---|---|---|
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:30:06 | 14/04/2019 18:30:06 | 1 | 07/04/2019 18:30 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:25:05 | 14/04/2019 18:30:06 | 0 | 07/04/2019 18:25 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:20:06 | 14/04/2019 18:30:06 | 0 | 07/04/2019 18:20 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:15:05 | 14/04/2019 18:30:06 | 0 | 07/04/2019 18:15 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:10:05 | 14/04/2019 18:30:06 | 0 | 07/04/2019 18:10 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:05:06 | 14/04/2019 18:30:06 | 0 | 07/04/2019 18:05 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:00:07 | 14/04/2019 18:30:06 | 0 | 07/04/2019 18:00 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:55:05 | 14/04/2019 18:30:06 | 0 | 07/04/2019 17:55 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:50:06 | 14/04/2019 18:30:06 | 0 | 07/04/2019 17:50 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:45:05 | 14/04/2019 18:30:06 | 0 | 07/04/2019 17:45 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:40:06 | 14/04/2019 18:30:06 | 0 | 07/04/2019 17:40 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:35:06 | 14/04/2019 18:30:06 | 0 | 07/04/2019 17:35 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:30:07 | 14/04/2019 18:30:06 | 0 | 07/04/2019 17:30 |
This goes on for many days' worth of data; I haven't included it all because the example would be huge.
Example desired output:
For this example, let's assume that 7 days ago we had 7 GB used, and that at 07/04/2019 17:40:06 and earlier we had only 6 GB used.
Name | VMCount | Total | Free | Used | Percent | DateTime | LatestDate | IsLatest | 7daysused |
---|---|---|---|---|---|---|---|---|---|
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:30:06 | 14/04/2019 18:30:06 | 1 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:25:05 | 14/04/2019 18:30:06 | 0 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:20:06 | 14/04/2019 18:30:06 | 0 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:15:05 | 14/04/2019 18:30:06 | 0 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:10:05 | 14/04/2019 18:30:06 | 0 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:05:06 | 14/04/2019 18:30:06 | 0 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 18:00:07 | 14/04/2019 18:30:06 | 0 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:55:05 | 14/04/2019 18:30:06 | 0 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:50:06 | 14/04/2019 18:30:06 | 0 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:45:05 | 14/04/2019 18:30:06 | 0 | 7 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:40:06 | 14/04/2019 18:30:06 | 0 | 6 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:35:06 | 14/04/2019 18:30:06 | 0 | 6 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:30:07 | 14/04/2019 18:30:06 | 0 | 6 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:25:05 | 14/04/2019 18:30:06 | 0 | 6 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:20:07 | 14/04/2019 18:30:06 | 0 | 6 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:15:06 | 14/04/2019 18:30:06 | 0 | 6 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:10:06 | 14/04/2019 18:30:06 | 0 | 6 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:05:06 | 14/04/2019 18:30:06 | 0 | 6 |
R-RG-STR02-LUN203-REPL-LOGS | 0 | 10 | 2 | 8 | 80 | 14/04/2019 17:00:07 | 14/04/2019 18:30:06 | 0 | 6 |
I hope I've explained that well; if anyone could help, that would be appreciated.
@Anonymous ,
You may try using RANKX.
https://community.powerbi.com/t5/Desktop/IF-AND-dax-Formula/m-p/293991#M129654
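As a rough sketch of the RANKX idea, adapted to the column names in the sample data above (untested against the actual model, so treat it as a starting point): for each row, collect the rows of the same datastore whose DateTime is at or before that row's "7days" target, rank them newest first, and take the "Used" value from rank 1.

```dax
7daysused =
VAR CurrentName = Tbldatastore[Name]
VAR TargetTime = Tbldatastore[7days]
-- Rows for the same datastore at or before the 7-days-ago target
VAR Candidates =
    FILTER (
        Tbldatastore,
        Tbldatastore[Name] = CurrentName
            && Tbldatastore[DateTime] <= TargetTime
    )
RETURN
    -- Rank candidates by DateTime descending; rank 1 is the closest sample
    MAXX (
        FILTER (
            Candidates,
            RANKX ( Candidates, Tbldatastore[DateTime], , DESC ) = 1
        ),
        Tbldatastore[Used]
    )
```

Using variables to capture the current row's Name and 7days value avoids EARLIER entirely, which tends to make calculated columns like this easier to read and debug.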
Hi,
Thanks for the reply. I tried to adapt the example you showed to my data, but it seemed to crash my machine. I'm guessing that's due to the amount of data I have in there and the fact that it tries to run this on every row.
The code I tried is:
Column =
IF (
    RANKX (
        FILTER (
            Tbldatastore,
            Tbldatastore[DateTime] < EARLIER ( Tbldatastore[7days] )
                && Tbldatastore[Name] = EARLIER ( Tbldatastore[Name] )
        ),
        Tbldatastore[Index],
        ,
        ASC
    ) = 1,
    Tbldatastore[Used],
    0
)
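For what it's worth, a variable-based version that skips RANKX altogether may be cheaper on a large table: it finds the latest candidate timestamp with a simple MAXX rather than ranking every candidate row. Column names are assumed from the sample data above; this is a sketch, not a tested solution.

```dax
7daysused =
VAR CurrentName = Tbldatastore[Name]
VAR TargetTime = Tbldatastore[7days]
-- Rows for the same datastore at or before the 7-days-ago target
VAR Candidates =
    FILTER (
        Tbldatastore,
        Tbldatastore[Name] = CurrentName
            && Tbldatastore[DateTime] <= TargetTime
    )
-- The sample timestamp closest to (but not after) the target
VAR ClosestTime = MAXX ( Candidates, Tbldatastore[DateTime] )
RETURN
    MAXX (
        FILTER ( Candidates, Tbldatastore[DateTime] = ClosestTime ),
        Tbldatastore[Used]
    )
```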