In Part 1 of this blog, we extracted Audit Log data from Office 365 using PowerShell and stored it in SharePoint for the purposes of quantifying Power BI adoption. In Part 2 we will transform and visualize the Audit Log data using Power BI.
We’ll focus on connecting to the Audit Log data, transforming it from JSON into a more usable state, and creating a couple of measures that help illustrate Power BI adoption.
Part 2 – SharePoint and Power BI
The following steps are executed using Power BI Desktop.
Your “Advanced Editor” entries should look similar to those listed below.
let
    Source = SharePoint.Files("Your SharePoint site URL here", [ApiVersion = 15]),
    #"Filtered Rows" = Table.SelectRows(Source, each ([Extension] = ".csv") and ([Name] = "Audit_Log-2017-10-25.csv" or [Name] = "Audit_Log-2017-11-02.csv")),
    #"Invoke Custom Function1" = Table.AddColumn(#"Filtered Rows", "Transform File from Query1", each #"Transform File from Query1"([Content])),
    #"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
    #"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from Query1"}),
    #"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from Query1", Table.ColumnNames(#"Transform File from Query1"(#"Sample File"))),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Table Column1", {{"Source.Name", type text}, {"Column1", type text}, {"Column2", type text}, {"Column3", type text}, {"Column4", type text}, {"Column5", type text}, {"Column6", type text}, {"Column7", type text}, {"Column8", type text}, {"Column9", type text}, {"Column10", type text}, {"Column11", type text}, {"Column12", type text}, {"Column13", type text}}),
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type", {"Source.Name", "Column1", "Column2", "Column3", "Column4", "Column5", "Column6", "Column7", "Column9", "Column10", "Column11", "Column12", "Column13"}),
    #"Removed Blank Rows" = Table.SelectRows(#"Removed Columns", each not List.IsEmpty(List.RemoveMatchingItems(Record.FieldValues(_), {"", null}))),
    #"Promoted Headers" = Table.PromoteHeaders(#"Removed Blank Rows", [PromoteAllScalars = true]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers", {{"AuditData", type text}}),
    #"Parsed JSON" = Table.TransformColumns(#"Changed Type1", {}, Json.Document),
    #"Expanded AuditData" = Table.ExpandRecordColumn(#"Parsed JSON", "AuditData",
        {"Id", "RecordType", "CreationTime", "Operation", "OrganizationId", "UserType", "UserKey", "Workload", "UserId", "ClientIP", "UserAgent", "Activity", "ItemName", "WorkSpaceName", "DatasetName", "ReportName", "WorkspaceId", "ObjectId", "DatasetId", "ReportId"},
        {"AuditData.Id", "AuditData.RecordType", "AuditData.CreationTime", "AuditData.Operation", "AuditData.OrganizationId", "AuditData.UserType", "AuditData.UserKey", "AuditData.Workload", "AuditData.UserId", "AuditData.ClientIP", "AuditData.UserAgent", "AuditData.Activity", "AuditData.ItemName", "AuditData.WorkSpaceName", "AuditData.DatasetName", "AuditData.ReportName", "AuditData.WorkspaceId", "AuditData.ObjectId", "AuditData.DatasetId", "AuditData.ReportId"}),
    #"Removed Columns1" = Table.RemoveColumns(#"Expanded AuditData", {"AuditData.Id", "AuditData.RecordType", "AuditData.OrganizationId", "AuditData.UserType", "AuditData.UserKey", "AuditData.Workload", "AuditData.ClientIP", "AuditData.UserAgent", "AuditData.WorkspaceId", "AuditData.DatasetId", "AuditData.ReportId"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns1", {{"AuditData.CreationTime", "CreationTime"}, {"AuditData.Operation", "Operation"}, {"AuditData.UserId", "UserId"}, {"AuditData.Activity", "Activity"}, {"AuditData.ItemName", "ItemName"}, {"AuditData.WorkSpaceName", "WorkSpaceName"}, {"AuditData.DatasetName", "DatasetName"}, {"AuditData.ReportName", "ReportName"}, {"AuditData.ObjectId", "ObjectId"}}),
    #"Duplicated Column" = Table.DuplicateColumn(#"Renamed Columns", "CreationTime", "CreationTime - Copy"),
    #"Renamed Columns2" = Table.RenameColumns(#"Duplicated Column", {{"CreationTime - Copy", "Activity Date"}}),
    #"Changed Type2" = Table.TransformColumnTypes(#"Renamed Columns2", {{"Activity Date", type datetime}}),
    #"Extracted Date" = Table.TransformColumns(#"Changed Type2", {{"Activity Date", DateTime.Date, type date}}),
    #"Removed Errors" = Table.RemoveRowsWithErrors(#"Extracted Date", {"CreationTime"}),
    #"Removed Duplicates" = Table.Distinct(#"Removed Errors", {"CreationTime"})
in
    #"Removed Duplicates"
My Power BI Adoption report utilizes a handful of DAX measures built on the query above. I’ll show the DAX measures required for unique users and the week-over-week KPIs. Everything else in the report is straightforward, and your imagination can guide you on how best to lay out the report.
Measure: Unique Users
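A measure along these lines counts distinct users. This is a sketch, not the exact measure from my report: it assumes the query above loads into a table named 'Audit Log' (the table name is an assumption; UserId is a real column produced by the query).

```dax
// Distinct count of users who generated any audit activity
// in the current filter context. 'Audit Log' is an assumed table name.
Unique Users =
DISTINCTCOUNT ( 'Audit Log'[UserId] )
```

Filtering the measure by Operation or Activity lets you count, say, only report viewers rather than all active users.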
Measure: Unique Users Previous Week
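The previous-week variant can be sketched by shifting the date context back seven days. Again the table name 'Audit Log' is an assumption; Activity Date is the date column created in the query. Note that DATEADD expects a contiguous date column, so in practice you may want to base this on a proper date table marked as such in the model.

```dax
// Same distinct-user count, evaluated 7 days earlier.
// Assumes [Unique Users] is defined as above and that
// 'Audit Log'[Activity Date] (or a related date table) is contiguous.
Unique Users Previous Week =
CALCULATE (
    [Unique Users],
    DATEADD ( 'Audit Log'[Activity Date], -7, DAY )
)
```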
Let’s put it all together now with a KPI showing unique-user performance compared to the previous week’s performance.
KPI: Users Week Over Week
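In the KPI visual this amounts to using [Unique Users] as the indicator, Activity Date (or a week column derived from it) as the trend axis, and [Unique Users Previous Week] as the target goal. If you prefer a numeric week-over-week figure, a supporting measure might look like the following sketch (name and formatting are my own, not from the original report):

```dax
// Week-over-week change in unique users, as a percentage.
// DIVIDE returns BLANK when the previous week has no users,
// avoiding a divide-by-zero error.
Users WoW % =
DIVIDE (
    [Unique Users] - [Unique Users Previous Week],
    [Unique Users Previous Week]
)
```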
My finished report is pictured below. I’ll revisit the report and the Audit Log data after more data has accumulated, with an eye towards seasonal trends, average visits per user per week, and “free” vs. “pro” license usage, and finally I’ll reassess my KPIs to ensure that they continue to quantify adoption in a way that makes sense for my organization.
Thanks for stopping by.
NY