tvm_analyst
Frequent Visitor

Calculating remediated vulnerabilities


I have two fact tables containing vulnerability data: a finding table and a finding_date table. The finding table is overwritten with the current findings after each scan runs; the finding_date table is appended with the same data. Vulnerabilities found in the finding_date table that are not in the current finding table can be considered remediated. I created a unique_key column by combining asset_id + vulnerability_id. For each unique_key that is not found in the finding table, the remediation_date would be LASTDATE() + 1.

 

What I would like to do is create a calculated table from these two tables with the following columns: asset_id, vulnerability_id (or unique_key), first_found_date, vulnerability_instances, and remediation_date (if available). I can then add calculated columns for measuring SLAs and vulnerability status.
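The table described above can be sketched as a single DAX calculated table. This is only a sketch: the table names 'finding' and 'finding_date', the [date] column, and the output name remediation_summary are assumptions based on the description, not the poster's actual model.

```dax
-- Sketch only: assumes finding_date has a [date] column and both
-- tables carry the unique_key column described above.
remediation_summary =
ADDCOLUMNS (
    SUMMARIZE ( 'finding_date', 'finding_date'[unique_key] ),
    "first_found_date", CALCULATE ( MIN ( 'finding_date'[date] ) ),
    "vulnerability_instances", CALCULATE ( COUNTROWS ( 'finding_date' ) ),
    "remediation_date",
        -- still present in the current finding table -> not remediated
        VAR stillOpen =
            NOT ISBLANK (
                LOOKUPVALUE (
                    'finding'[unique_key],
                    'finding'[unique_key], 'finding_date'[unique_key]
                )
            )
        RETURN
            IF ( stillOpen, BLANK (), CALCULATE ( MAX ( 'finding_date'[date] ) ) + 1 )
)
```

The CALCULATE wrappers trigger context transition, so MIN/MAX/COUNTROWS are evaluated per unique_key rather than over the whole table.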

 

 

1 ACCEPTED SOLUTION

Accepted Solutions
Super User III

Re: Calculating remediated vulnerabilities

hi @tvm_analyst 

you could try something like this:

new table =
ADDCOLUMNS (
    SUMMARIZE (
        'finding_date',
        'finding_date'[asset_id],
        'finding_date'[vulnerability_id]
    ),
    "firstFoundDate", CALCULATE ( MIN ( 'finding_date'[date] ) ),
    "lastFoundDate", CALCULATE ( MAX ( 'finding_date'[date] ) )
)

then, assuming that asset_id and vulnerability_id are unique in the finding table, create a new column in the new table

isSolvedFlag =
IF (
    ISBLANK (
        LOOKUPVALUE (
            'finding'[date],
            'finding'[asset_id], 'new table'[asset_id],
            'finding'[vulnerability_id], 'new table'[vulnerability_id]
        )
    ),
    1,
    0
)

and then you should be on your way
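The two helper columns from this answer can then be combined into the remediation date the question asks for. A minimal sketch on top of 'new table' (column name remediationDate is illustrative, not from the thread):

```dax
-- Flagged as solved -> remediated the day after the last sighting;
-- otherwise still open, so leave the date blank.
remediationDate =
IF (
    'new table'[isSolvedFlag] = 1,
    'new table'[lastFoundDate] + 1,
    BLANK ()
)
```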

 


10 REPLIES

tvm_analyst
Frequent Visitor

Re: Calculating remediated vulnerabilities

Thank you for the response @sturlaws! However, this would assign the min date found in the entire table to all the vulnerabilities. I would like to do something like a GROUPBY(unique_key) and find the MIN/MAX value within each group, if that makes sense.
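The GROUPBY the poster describes can be written literally in DAX with CURRENTGROUP() (a sketch using the thread's column names; the output name grouped_findings is illustrative). Note, though, that the CALCULATE ( MIN ( ... ) ) pattern in the accepted answer already evaluates per group via context transition, which is what resolves the issue later in the thread.

```dax
-- Per-group MIN/MAX via GROUPBY; MINX/MAXX iterate CURRENTGROUP()
grouped_findings =
GROUPBY (
    'finding_date',
    'finding_date'[unique_key],
    "firstFoundDate", MINX ( CURRENTGROUP (), 'finding_date'[date] ),
    "lastFoundDate", MAXX ( CURRENTGROUP (), 'finding_date'[date] )
)
```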

 

Super User III

Re: Calculating remediated vulnerabilities

@tvm_analyst, since you have not made any sample data available, I don't know exactly how your data looks, so the code should be considered more as a guideline for you to continue working on to get your desired result

Super User III

Re: Calculating remediated vulnerabilities

if you care to upload some sample data I will take a look at it 🙂

tvm_analyst
Frequent Visitor

Re: Calculating remediated vulnerabilities

Sure, so below is some sample data. This would assume today is Jan 31st.

(There is a one-to-many relationship between the 2 fact tables, based on the unique_key, if that matters.)

 

fact_asset_vulnerability_finding_date 

date_written  asset_id  vulnerability_id  unique_key   first_found_date
1/2/19        2111      13456             2111-13456   1/2/19
1/3/19        2111      13456             2111-13456   1/2/19
1/4/19        2111      13456             2111-13456   1/2/19
1/5/19        2111      13456             2111-13456   1/2/19
1/6/19        2111      13456             2111-13456   1/2/19
1/7/19        2111      13456             2111-13456   1/2/19
1/8/19        2111      13456             2111-13456   1/2/19
1/9/19        2111      13456             2111-13456   1/2/19
1/10/19       2111      13456             2111-13456   1/2/19
1/11/19       2111      13456             2111-13456   1/2/19
1/12/19       2111      13456             2111-13456   1/2/19
1/13/19       2111      13456             2111-13456   1/2/19
1/14/19       2111      13456             2111-13456   1/2/19
1/15/19       2111      13456             2111-13456   1/2/19
1/16/19       2111      13456             2111-13456   1/2/19
1/17/19       2112      13457             2112-13457   1/17/19
1/18/19       2112      13457             2112-13457   1/17/19
1/19/19       2112      13457             2112-13457   1/17/19
1/20/19       2112      13457             2112-13457   1/17/19
1/21/19       2112      13457             2112-13457   1/17/19
1/22/19       2112      13457             2112-13457   1/17/19
1/23/19       2112      13457             2112-13457   1/17/19
1/24/19       2112      13457             2112-13457   1/17/19
1/25/19       2112      13457             2112-13457   1/17/19
1/26/19       2112      13457             2112-13457   1/17/19
1/27/19       2112      13457             2112-13457   1/17/19
1/28/19       2112      13457             2112-13457   1/17/19
1/29/19       2112      13457             2112-13457   1/17/19
1/30/19       2112      13457             2112-13457   1/17/19

 

fact_asset_vulnerability_finding

asset_id  vulnerability_id  unique_key   first_found_date
2112      13457             2112-13457   1/17/19

 

What I am looking to get as output: (remediation_date = MAX(date_written) + 1)

unique_key   first_found_date  remediation_date
2111-13456   1/2/19            1/17/19
2112-13457   1/17/19
Super User III

Re: Calculating remediated vulnerabilities

Hm, I did it pretty much exactly as I described and got this.

tvm_analyst
Frequent Visitor

Re: Calculating remediated vulnerabilities

For some reason, adding .[Date] to the input for MIN/MAX was causing my issue. I removed it and everything seems to be working as intended. Thank you @sturlaws!

tvm_analyst
Frequent Visitor

Re: Calculating remediated vulnerabilities

One last question: how would I return a null/BLANK() remediation_date for unique_keys that are present in the finding table? Here's what I've got so far...

 

summarized_finding_date =
ADDCOLUMNS (
    SUMMARIZE (
        'public fact_asset_vulnerability_finding_date',
        'public fact_asset_vulnerability_finding_date'[asset_id],
        'public fact_asset_vulnerability_finding_date'[vulnerability_id],
        'public dim_vulnerability'[sla_target],
        'public fact_asset_vulnerability_finding_date'[unique_key]
    ),
    "first_found_date", FIRSTDATE ( 'public fact_asset_vulnerability_finding_date'[day] ),
    "remediation_date", LASTDATE ( 'public fact_asset_vulnerability_finding_date'[day] ) + 1
)

 

Super User III

Re: Calculating remediated vulnerabilities

How about first creating a column in the finding table to identify unique keys which are not present in the finding_date table:

DetectedFlag =
IF (
    COUNTROWS (
        FILTER (
            fact_asset_vulnerability_finding_date,
            fact_asset_vulnerability_finding_date[unique_key]
                = CALCULATE ( VALUES ( fact_asset_vulnerability_finding[unique_key] ) )
        )
    ) >= 1,
    0,
    1
)


then use this flag, together with a UNION query, to add these rows to the Assets table query:

Assets =
UNION (
    ADDCOLUMNS (
        SUMMARIZE (
            fact_asset_vulnerability_finding_date,
            fact_asset_vulnerability_finding_date[asset_id],
            fact_asset_vulnerability_finding_date[vulnerability_id],
            fact_asset_vulnerability_finding_date[unique_key]
        ),
        "First found date", CALCULATE ( MIN ( fact_asset_vulnerability_finding_date[first_found_date] ) ),
        "Last date written", CALCULATE ( MAX ( fact_asset_vulnerability_finding_date[date_written] ) )
    ),
    ADDCOLUMNS (
        SUMMARIZE (
            FILTER ( fact_asset_vulnerability_finding, fact_asset_vulnerability_finding[DetectedFlag] = 1 ),
            fact_asset_vulnerability_finding[asset_id],
            fact_asset_vulnerability_finding[vulnerability_id],
            fact_asset_vulnerability_finding[unique_key]
        ),
        "First found date", CALCULATE ( MIN ( fact_asset_vulnerability_finding[first_found_date] ) ),
        "Last date written", BLANK ()
    )
)

Alternatively, you can try to do this in Power Query during the import of the data
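With the union table in place, the remediation date itself could then be derived as a calculated column. A sketch under the thread's definition (blank while the unique_key is still in the current finding table, otherwise last written date + 1; the table name Assets follows the reply above):

```dax
-- Calculated column on the Assets union table
remediation_date =
IF (
    ISBLANK (
        LOOKUPVALUE (
            fact_asset_vulnerability_finding[unique_key],
            fact_asset_vulnerability_finding[unique_key], Assets[unique_key]
        )
    ),
    -- no longer reported by scans -> remediated the day after last sighting
    Assets[Last date written] + 1
)
```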
