
Ray_Brosius
Helper III

Power BI Get Activity REST API

I obtained this "code" from the BI Elite team via a very cool YouTube series on how to build a Power BI usage report.

This was working and just stopped some months ago.

(** Is there a way to upload a file to this forum? I could upload a PBIX file that has all the code; it would just need to be updated with a specific app ID/client secret and username/password credentials. **)

 

I am having trouble executing the Power BI admin REST APIs:

https://api.powerbi.com/v1.0/myorg/admin/activityevents

https://api.powerbi.com/v1.0/myorg/admin/groups

 

I am able to get an access token, but when I then call the above REST APIs I get an error:

Expression.Error: Access to the resource is forbidden.

 

Below is the "code" I use. This used to work. To get around this error, I go to the Power BI REST API pages and "try" the Get Activity Events call, which asks me to log in and then provides a bearer token. I plug that token into my code rather than getting a token via the function I created.

 

The code in the Activity table calls the Get Activity function and passes it the value returned from the Get Access Token function. I know the Get Access Token function is returning a token, but that token does not seem to work when calling the Get Activity endpoint.

 

Get Access Token FUNCTION
let
    Source = () =>
let
    // The body must be a single string; line breaks inside the literal are invalid M.
    // "secret_id" is not a recognized AAD token parameter and has been dropped.
    body = "grant_type=password&resource=https://graph.microsoft.com&username=USERNAME@MS365DOMAIN.com&password=PASSWORD&client_id=MYCLIENT_ID&client_secret=MY_CLIENT_SECRET",

    Data = Json.Document(Web.Contents("https://login.microsoftonline.com/MY_TENANT_ID/oauth2/token", [Headers=[#"Content-Type"="application/x-www-form-urlencoded"], Content=Text.ToBinary(body)])),
    access_token = Data[access_token]
in
    access_token
in
    Source
****************************************************************************
****************************************************************************
****************************************************************************
Get Activity FUNCTION
let
    Source = (accessToken as text, optional startDate as text, optional endDate as text, optional continuationToken as text, optional loop as number, optional data as list) =>
let
    Source = Json.Document(Web.Contents("https://api.powerbi.com/v1.0/myorg/",
        [
            RelativePath = if loop = 0
                then "admin/activityevents?startDateTime=" & startDate & "&endDateTime=" & endDate
                else "admin/activityevents?continuationToken='" & continuationToken & "'",
            Headers = [Authorization = "Bearer " & accessToken]
        ]
    )),
    token = Source[continuationToken],
    currentData = Source[activityEventEntities],
    appendedData = List.Combine({data, currentData}),
    loopNum = loop + 1,
    output =
        if token is null or loopNum > 100
        then appendedData
        else @#"GET Activity"(accessToken, "", "", token, loopNum, appendedData)
in
    output
in
    Source
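The continuation-token recursion above can be mirrored in plain Python with an injected page-fetching function (a sketch; `fetch_page` and `fetch_all_activities` are illustrative names, not part of the Power BI API):

```python
def fetch_all_activities(fetch_page, start, end, max_pages=100):
    """Collect activity events across pages, following continuationToken.

    fetch_page(start, end, token) must return a dict shaped like the
    activityevents response:
      {"activityEventEntities": [...], "continuationToken": <str or None>}
    """
    events = []
    token = None
    for _ in range(max_pages):  # cap iterations, like the loopNum > 100 guard
        page = fetch_page(start, end, token)
        events.extend(page.get("activityEventEntities", []))
        token = page.get("continuationToken")
        if token is None:
            break
    return events
```

The same structure works whether the caller hits the real endpoint or a stub, which makes the pagination loop easy to unit test.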
****************************************************************************
****************************************************************************
****************************************************************************
ACTIVITY TABLE
let
    Source = List.Dates,
    #"Invoked FunctionSource" = Source(Date.AddDays(DateTime.Date(DateTime.FixedLocalNow()), -30), 30, #duration(1, 0, 0, 0)),
    #"Converted to Table" = Table.FromList(#"Invoked FunctionSource", Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Renamed Columns" = Table.RenameColumns(#"Converted to Table",{{"Column1", "Date"}}),
    // M date format strings are case-sensitive: "yyyy-MM-dd", not "YYYY-MM-DD"
    #"Added Custom" = Table.AddColumn(#"Renamed Columns", "StartDate", each "'" & Date.ToText([Date], "yyyy-MM-dd") & "T00:00:00.000Z" & "'"),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "EndDate", each "'" & Date.ToText([Date], "yyyy-MM-dd") & "T23:59:59.999Z" & "'"),
    #"Invoked Custom Function" = Table.AddColumn(#"Added Custom1", "GETActivity", each #"GET Activity"(#"GET Access Token"(), [StartDate], [EndDate], null, 0, {})),
#"Expanded GETActivity" = Table.ExpandListColumn(#"Invoked Custom Function", "GETActivity"),
#"Expanded GETActivity1" = Table.ExpandRecordColumn(#"Expanded GETActivity", "GETActivity", 
{
"Id", "RecordType", "CreationTime", "Operation", "OrganizationId", "UserType", "UserKey", "Workload", "UserId", "ClientIP", "UserAgent", "Activity", "ItemName", "WorkSpaceName", "DatasetName", "CapacityId", "CapacityName", "WorkspaceId", "ObjectId", "DatasetId", "DataConnectivityMode", "ArtifactId", "ArtifactName", "IsSuccess", "RequestId", "ActivityId", "RefreshType", "LastRefreshTime", "ArtifactKind", "RefreshEnforcementPolicy", "DataflowId", "DataflowName", "DataflowAccessTokenRequestParameters", "DataflowType", "CustomVisualAccessTokenResourceId", "CustomVisualAccessTokenSiteUri", "ImportId", "ImportSource", "ImportType", "ImportDisplayName", "ReportName", "ReportId", "ReportType", "DistributionMethod", "ConsumptionMethod"
},
{
"Id", "RecordType", "CreationTime", "Operation", "OrganizationId", "UserType", "UserKey", "Workload", "UserId", "ClientIP", "UserAgent", "Activity", "ItemName", "WorkSpaceName", "DatasetName", "CapacityId", "CapacityName", "WorkspaceId", "ObjectId", "DatasetId", "DataConnectivityMode", "ArtifactId", "ArtifactName", "IsSuccess", "RequestId", "ActivityId", "RefreshType", "LastRefreshTime", "ArtifactKind", "RefreshEnforcementPolicy", "DataflowId", "DataflowName", "DataflowAccessTokenRequestParameters", "DataflowType", "CustomVisualAccessTokenResourceId", "CustomVisualAccessTokenSiteUri", "ImportId", "ImportSource", "ImportType", "ImportDisplayName", "ReportName", "ReportId", "ReportType", "DistributionMethod", "ConsumptionMethod"
}
),
#"Changed Type" = Table.TransformColumnTypes(#"Expanded GETActivity1",
{
{"Date", type date},{"StartDate", type text}, {"EndDate", type text}, {"Id", type text}, {"RecordType", Int64.Type}, {"CreationTime", type datetime}, {"Operation", type text}, {"OrganizationId", type text}, {"UserType", Int64.Type}, {"UserKey", type text}, {"Workload", type text}, {"UserId", type text}, {"ClientIP", type text}, {"UserAgent", type text}, {"Activity", type text}, {"ItemName", type text}, {"WorkSpaceName", type text}, {"DatasetName", type text}, {"ReportName", type text}, {"WorkspaceId", type text}, {"ObjectId", type text}, {"DatasetId", type text}, {"ReportId", type text}, {"IsSuccess", type logical}, {"ReportType", type text}, {"RequestId", type text}, {"ActivityId", type text}, {"DistributionMethod", type text}, {"ImportId", type any}, {"ImportSource", type any}, {"ImportType", type any}, {"ImportDisplayName", type any}
}),
#"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"StartDate", "EndDate"}),
#"Renamed Columns1" = Table.RenameColumns(#"Removed Columns",{{"UserId", "Email"}}),
#"Filtered Rows" = Table.SelectRows(#"Renamed Columns1", each ([Id] <> null))
in 
#"Filtered Rows"
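For reference, the date-window construction in the Activity table (one row per day, with quoted ISO timestamps) can be sketched in Python; `daily_windows` is an illustrative helper name, not part of any API:

```python
from datetime import date, timedelta

def daily_windows(days_back=30, today=None):
    """Return (StartDate, EndDate) strings for each of the last N days,
    quoted the way the activityevents endpoint expects them."""
    today = today or date.today()
    windows = []
    for offset in range(days_back, 0, -1):
        d = today - timedelta(days=offset)
        iso = d.strftime("%Y-%m-%d")
        windows.append((f"'{iso}T00:00:00.000Z'", f"'{iso}T23:59:59.999Z'"))
    return windows
```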
1 ACCEPTED SOLUTION

Ray_Brosius
Helper III

I was able to solve the problem.

1) I set up the Azure app to be a member of the security group that is configured in the Power BI Admin portal to have access to the REST APIs.

2) I have a PBIX file that can get the last 30 days of activity, but as mentioned here that is always just a rolling 30-day window. If that works for you, it is certainly an option.

3) I edited the "GET TOKEN" function to the below:

         () =>
         let
             // client_credentials does not use username/password, and the body must be a single string
             body = "grant_type=client_credentials&resource=https://analysis.windows.net/powerbi/api&client_id=CLIENT_ID&client_secret=CLIENT_SECRET",
             Data = Json.Document(
                 Web.Contents("https://login.microsoftonline.com/YOUR_TENANT_ID/oauth2/token",
                     [ Headers=[#"Content-Type"="application/x-www-form-urlencoded"],
                       Content=Text.ToBinary(body)
                     ])),
             access_token = Data[access_token]
         in
             access_token
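To cross-check the fixed request, the same client_credentials body can be assembled in Python with urlencode rather than hand-concatenated strings (a sketch; `build_token_body` is my own name, and the values are placeholders):

```python
from urllib.parse import urlencode

def build_token_body(client_id, client_secret,
                     resource="https://analysis.windows.net/powerbi/api"):
    """Form-encoded body for the v1 /oauth2/token endpoint using the
    client_credentials grant; no username/password belongs in this grant."""
    return urlencode({
        "grant_type": "client_credentials",
        "resource": resource,
        "client_id": client_id,
        "client_secret": client_secret,
    })
```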
 
4) Lastly, and the key to this: I found an excellent source of Python code for building a process that gets the activities from the prior day and saves the results into a CSV file. I edited that code to save the data to our Snowflake database table.
https://pbi-guy.com/2022/03/10/power-bi-and-activity-logs-with-python/
 
Here is the Python code; you will need to replace the parameters as appropriate.

#Import necessary libraries
import msal
import requests
import pandas as pd
from datetime import date, timedelta
import snowflake.connector

# Function to insert a row into the Snowflake table
def insert_row(row):
    sql_insert = f"INSERT INTO DATABASE.SCHEMA.TABLE(Id, RecordType, CreationTime, Operation, OrganizationId, UserType, UserKey, Workload, UserId, ClientIP, UserAgent, Activity, IsSuccess, RequestId, ActivityId, ItemName, WorkSpaceName, DatasetName, ReportName, WorkspaceId, ObjectId, DatasetId, ReportId, ReportType, DistributionMethod, ConsumptionMethod) VALUES ('{row['Id']}','{row['RecordType']}','{row['CreationTime']}','{row['Operation']}','{row['OrganizationId']}','{row['UserType']}','{row['UserKey']}','{row['Workload']}','{row['UserId']}','{row['ClientIP']}','{row['UserAgent']}','{row['Activity']}','{row['IsSuccess']}','{row['RequestId']}','{row['ActivityId']}','{row['ItemName']}','{row['WorkSpaceName']}','{row['DatasetName']}','{row['ReportName']}','{row['WorkspaceId']}','{row['ObjectId']}','{row['DatasetId']}','{row['ReportId']}','{row['ReportType']}','{row['DistributionMethod']}','{row['ConsumptionMethod']}')"
    cursor.execute(sql_insert)

#Set SnowFlake parameters and create Connection
con = snowflake.connector.connect(
    account="SNOWFLAKE ACCOUNT",
    user= "USER",
    password= "PASSWORD",
    warehouse= "WAREHOUSE",
    database= "DATABASE",
    schema= "SCHEMA"
    )

#Get yesterday's date and convert to string
activityDate = date.today() - timedelta(days=1)
activityDate = activityDate.strftime("%Y-%m-%d")

# Replace with your Azure AD app details
client_id = 'CLIENT_ID'
client_secret = 'CLIENT_SECRET'
tenant_id = 'TENANT_ID'
authority_url = "https://login.microsoftonline.com/" + tenant_id
scope = ["https://analysis.windows.net/powerbi/api/.default"]

#Set Power BI REST API URL to get activities for yesterday
url = "https://api.powerbi.com/v1.0/myorg/admin/activityevents?startDateTime='" + activityDate + "T00:00:00'&endDateTime='" + activityDate + "T23:59:59'"


#Use MSAL to grab token
app = msal.ConfidentialClientApplication(client_id, authority=authority_url, client_credential=client_secret)
result = app.acquire_token_for_client(scopes=scope)

#Get latest Power BI Activities
if 'access_token' in result:
    access_token = result['access_token']
    header = {'Content-Type':'application/json', 'Authorization':f'Bearer {access_token}'}
    api_call = requests.get(url=url, headers=header)
    
    #Specify empty Dataframe with all columns
    column_names = ['Id', 'RecordType', 'CreationTime', 'Operation', 'OrganizationId', 'UserType', 'UserKey', 'Workload', 'UserId', 'ClientIP', 'UserAgent', 'Activity', 'IsSuccess', 'RequestId', 'ActivityId', 'ItemName', 'WorkSpaceName', 'DatasetName', 'ReportName', 'WorkspaceId', 'ObjectId', 'DatasetId', 'ReportId', 'ReportType', 'DistributionMethod', 'ConsumptionMethod']
    df = pd.DataFrame(columns=column_names)

    #Set continuation URL (may be absent or null on the last page)
    contUrl = api_call.json().get('continuationUri')

    #Get all activities from the first response, save to a dataframe (df1) and append to df
    result = api_call.json()['activityEventEntities']
    df1 = pd.DataFrame(result)
    df = pd.concat([df, df1])

    #Follow the continuation URL until it runs out to get all activities for the day
    while contUrl is not None:
        api_call_cont = requests.get(url=contUrl, headers=header)
        contUrl = api_call_cont.json().get('continuationUri')
        result = api_call_cont.json()['activityEventEntities']
        df2 = pd.DataFrame(result)
        df = pd.concat([df, df2])
    
    #Set Cursor for Snowflake 
    cursor = con.cursor()  
    # Apply the function to each row of the DataFrame
    df.apply(insert_row, axis=1)
    
    # Commit the transaction to save the changes
    con.commit()
    con.close()
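One caution on the script above: `insert_row` builds the INSERT with an f-string, which breaks on values containing single quotes and is open to SQL injection. snowflake-connector-python supports `%s`-style bind variables, so a safer sketch (column list abbreviated; the table name is a placeholder):

```python
COLUMNS = ["Id", "RecordType", "CreationTime", "Operation", "UserId",
           "Activity", "IsSuccess"]  # extend to the full column list as needed

def build_insert(table, columns):
    """Parameterized INSERT statement with one %s placeholder per column."""
    col_list = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    return f"INSERT INTO {table} ({col_list}) VALUES ({placeholders})"

# Usage with a Snowflake cursor (values are passed separately, never interpolated):
# cursor.execute(build_insert("DATABASE.SCHEMA.TABLE", COLUMNS),
#                tuple(row[c] for c in COLUMNS))
```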
    

 

 

lbendlin
Super User


That's not how you are supposed to consume the activity reports. Keep in mind they are ephemeral and will disappear after 30 days. You are supposed to download the JSON extracts daily (via PowerShell), store them in a central repository, and perhaps feed a dataflow from the files, or read them into datasets directly.
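The daily extract-to-repository pattern suggested here can also be sketched in Python (the file layout and the `save_daily_extract` helper are my own, not an established convention):

```python
import json
from datetime import date, timedelta
from pathlib import Path

def save_daily_extract(events, out_dir, day=None):
    """Write one JSON file per day (e.g. activity_2024-01-01.json) so the
    rolling 30-day window is preserved in a central repository."""
    day = day or (date.today() - timedelta(days=1))
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    path = out_dir / f"activity_{day.isoformat()}.json"
    path.write_text(json.dumps(events))
    return path
```

A downstream dataflow or dataset can then fold over the folder of daily files instead of depending on the 30-day API window.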

Thanks for your reply.

Yes, I know that. We are working in parallel to build scripts that run a daily extract and use our Snowflake instance for storage. I posted this version here because our first pass was just to get the last 30 days' worth of activity.

Your response does not help solve the problem I posted.

 

The essence of the problem is that while I can get an access token, that token does not work as the bearer token when calling the Admin - Get Activity Events endpoint. If I go to the Power BI REST API page https://learn.microsoft.com/en-us/rest/api/power-bi/admin/get-activity-events and use "Try it",

I can take the bearer token that is provided, plug it into my Power BI code, and it all works.

So why does that bearer token work and not the one from my code at the top here?

 

This could be a problem with our Azure app, but I am hoping someone here can comment on whether something has changed in the authentication process/library for getting a proper access token.

According to Chris Webb, you can only make this work with custom connectors, not with a regular Power Query script.

This used to work all by itself, and I can make it work as described by manually getting an auth token from the Power BI REST API documentation pages and running it from there.

 

And even then, the same "logic" and endpoints need to be used in Python code to get activity data from Power BI.

When did it stop working all by itself?
