Hi,
I have a PowerShell script which currently writes output to a .csv on a local drive, using the Get-PowerBIReport cmdlet amongst others.
Does anyone know how to write to a SharePoint location instead?
We currently have to save this manually to the SharePoint location and then run the Power BI report.
It would be better if we could write directly to the SharePoint location from PowerShell.
Thanks
Hi @po,
I'd suggest combining the function scripts below, which upload files to SharePoint, into your code, and then removing the local files once the operations have finished.
Upload on-premises content to SharePoint using PowerShell cmdlets
Regards,
Xiaoxin Sheng
Hi,
Thanks for the replies.
Not sure how I would do this - do you have further info?
"Synchronize your SharePoint Document library with a local folder on your PC. Then point the PowerShell script to that local folder. OneDrive will do the rest."
I will have a read through the document.
Thanks
Hi @po,
It seems like that sample syncs a specific folder on a SharePoint schedule. If it does not meet your requirements, you can try the following code, which manually uploads the file to SharePoint using a 'filestream':
SharePoint Online: Upload File to Sub-Folder using PowerShell
BTW, I don't think you can skip the export-to-file step and write the generated output directly to SharePoint (you need a temporary file to receive the exported records, then upload it).
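A sketch of that flow - export to a temporary local file, upload it, then delete it (the site URL, folder path, and the $Reports and $PSCredentials variables are illustrative placeholders, not from the original script):

```powershell
# Export records to a local temp file, upload it to SharePoint, then clean up
$tmp = Join-Path $env:TEMP "reports.csv"
$Reports | Export-Csv -Path $tmp -NoTypeInformation

Connect-PnPOnline -Url "https://<company>.sharepoint.com/sites/BI" -Credentials $PSCredentials
Add-PnPFile -Path $tmp -Folder "Shared Documents/Power BI" | Out-Null
Remove-Item $tmp   # remove the local copy once the upload succeeds
```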
Regards,
Xiaoxin Sheng
Hi,
I am using 2 scripts which work independently - one to get an audit of workspaces etc. and one to upload to SharePoint.
However, when I either use a powerbi_ctrl script which runs the two together, or add the upload-to-SharePoint part at the end of the first script, I get the message below: The collection has not been initialized.
Script complete: 07/07/2020 10:01:52 AM
Upload to sharepoint start
format-default : The collection has not been initialized. It has not been requested or the request has not been executed. It may need to be expli
+ CategoryInfo : NotSpecified: (:) [format-default], CollectionNotInitializedException
+ FullyQualifiedErrorId : Microsoft.SharePoint.Client.CollectionNotInitializedException,Microsoft.PowerShell.Commands.FormatDefaultCommand
The script I am running is below.
How can we get these 2 parts (first to get the audit of workspaces and second to upload to SharePoint) running sequentially, either within the same script (which would be best) or via a PowerShell script which calls both parts?
Otherwise we have to schedule these to run independently, which is less ideal.
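For reference, a minimal control script could simply invoke the two existing scripts in order, stopping if the first one fails - a sketch, where the file names powerbi_audit.ps1 and sharepoint_upload.ps1 are placeholders for your actual script files:

```powershell
# powerbi_ctrl.ps1 - run the workspace audit, then the SharePoint upload
$ErrorActionPreference = 'Stop'   # abort the sequence if the audit script throws

& "C:\pbi_pshell\powerbi_audit.ps1"        # part 1: export the workspace audit to C:\pbi_pshell\files
& "C:\pbi_pshell\sharepoint_upload.ps1"    # part 2: upload the generated .csv files to SharePoint
```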
#
# list all workspaces, reports and dashboards across every workspace. The "current date" is added to each row...Power BI will then be able to tell the "first date"
# that something appeared.
#
Write-Host "Starting script:" (Get-Date).ToString('MM/dd/yyyy hh:mm:ss tt')
# connect to PBI service using the service account
$User = "<user>"
$PWord = ConvertTo-SecureString -String "<pwd>" -AsPlainText -Force
$UserCredential = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList $User, $PWord
Connect-PowerBIServiceAccount -Credential $UserCredential
$logbase = "C:\pbi_pshell\files\" # *** Directory or UNC path where you want the output to go to***"
#################################
# Workspaces #
#################################
#
# Scott notes:
# Workspace type "Workspace" is new workspace experience
# "Group" is old workspace experience
# "PersonalGroup" is "My Workspace" for end users
#
# State "Active"
# "Removing" these are already deleted...
# "Deleted"
# "Deprovisioning failed"
#
#
# Things blow up later (like listing datasets, datasources, etc.) if states other than "Active" are in the list...so I'm going to filter to active only...
# Also may remove personal workspaces for now just to improve speed...
#
#
Write-Host "******* Exporting Workspaces *****"
$Workspaces = Get-PowerBIWorkspace -Scope Organization -All | where state -eq "Active" | where type -ne "PersonalGroup" | where type -eq "Workspace" | Where {($_.Name -NotLike "Personal*")}
# export workspaces
$logpath = $logbase + "workspaces.csv"
$Workspaces | select Id, Name, Type, State, IsReadOnly, IsOrphaned, CapacityId | Export-Csv -Path $logpath -NoTypeInformation
#################################
# Dashboards #
#################################
# Note - similar to datasets, we can't just grab dashboards, because the resulting object doesn't say which workspace it is in. So we have to loop over each workspace
# and grab just the dashboards in it to make that link.
#$logpath = $logbase + "dashboards.csv"
#$Dashboards =
#ForEach ($workspace in $Workspaces)
# {
# Write-Host "Writing dashboards...on workspace: " $workspace.Name
# ForEach ($dashboard in (Get-PowerBIDashboard -Scope Organization -WorkspaceId $workspace.Id))
# {
# [pscustomobject]@{
# WorkspaceID = $workspace.Id
# WorkspaceName = $workspace.Name
# DashboardID = $dashboard.Id
# DashboardName = $dashboard.Name
# }
# }
# }
#$Dashboards | Export-Csv -Path $logpath -NoTypeInformation
#################################
# Reports #
#################################
# same as dashboards and datasets - loop over each workspace and list the reports in it.
$logpath = $logbase + "reports.csv"
$Reports =
ForEach ($workspace in $Workspaces)
{
Write-Host "Writing reports...on workspace: " $workspace.Name
ForEach ($report in (Get-PowerBIReport -Scope Organization -WorkspaceId $workspace.Id))
{
[pscustomobject]@{
WorkspaceID = $workspace.Id
WorkspaceName = $workspace.Name
ReportID = $report.Id
ReportName = $report.Name
ReportURL = $report.WebUrl
ReportDatasetID = $report.DatasetId
}
}
}
$Reports | Export-Csv -Path $logpath -NoTypeInformation
#################################
# Apps #
#################################
$url = "apps"
$Apps = (ConvertFrom-Json (Invoke-PowerBIRestMethod -Url $url -Method Get)).value
# export apps
$logpath = $logbase + "apps.csv"
$Apps | Export-Csv -Path $logpath -NoTypeInformation
# export app dashboards
$logpath = $logbase + "app_dashboards.csv"
$AppDashboards =
ForEach ($app in $Apps)
{
$url = "apps/" + $app.Id + "/dashboards"
$app_dashboards = (ConvertFrom-Json (Invoke-PowerBIRestMethod -Url $url -Method Get)).value
ForEach ($dashboard in $app_dashboards)
{
[pscustomobject]@{
AppID = $app.Id
AppName = $app.Name
DashboardID = $dashboard.Id
DashboardName = $dashboard.displayName
IsReadOnly = $dashboard.isReadOnly
}
}
}
$AppDashboards | Export-Csv -Path $logpath -NoTypeInformation
# export app reports
$logpath = $logbase + "app_reports.csv"
$AppReports =
ForEach ($app in $Apps)
{
$url = "apps/" + $app.Id + "/reports"
$app_reports = (ConvertFrom-Json (Invoke-PowerBIRestMethod -Url $url -Method Get)).value
ForEach ($report in $app_reports)
{
[pscustomobject]@{
AppID = $app.Id
AppName = $app.Name
ReportID = $report.Id
ReportName = $report.Name
ReportURL = $report.webURL
}
}
}
$AppReports | Export-Csv -Path $logpath -NoTypeInformation
Disconnect-PowerBIServiceAccount
Write-Host "Script complete:" (Get-Date).ToString('MM/dd/yyyy hh:mm:ss tt')
#Powershell script to copy files from Shared drive to SharePoint Online
Write-Host "Upload to sharepoint start"
# Variables - change the parameters as needed
$O365ServiceAccount="<user_account>" # Your user name
$O365ServiceAccountPwd="<pwd>" # Your password - you can use Get-Credential as well
$SharePointSiteURL="https://<company>.sharepoint.com/sites/BI" # Change this SharePoint site URL
$SharedDriveFolderPath="C:\pbi_pshell\files" # Change this network folder path
$SharePointFolderPath="Shared Documents/Power BI/Workspace Inventory" # Change the document library and folder path
#Ends
[SecureString]$SecurePass = ConvertTo-SecureString $O365ServiceAccountPwd -AsPlainText -Force
[System.Management.Automation.PSCredential]$PSCredentials = New-Object System.Management.Automation.PSCredential($O365ServiceAccount, $SecurePass)
#Connecting to SharePoint Online site
Try {
    Connect-PnPOnline -Url $SharePointSiteURL -Credentials $PSCredentials
    # Upload each file (not folder) in the local directory, newest first
    Get-ChildItem $SharedDriveFolderPath | Where-Object {$_.PSIsContainer -ne $True} | Sort-Object CreationTime -Descending | ForEach-Object {
        Add-PnPFile -Path $_.FullName -Folder $SharePointFolderPath # Add file to SharePoint document library folder
    }
}
catch {
    Write-Host "Error: $($_.Exception.Message)" -ForegroundColor Red
}
Write-Host "Upload to sharepoint complete"
Hi,
Just to let you know, I managed to get this working.
I added the bit in bold below (capturing the Add-PnPFile output in a variable) - it looks like some other people have had a similar issue with output appearing before Add-PnPFile:
$file = Add-PnPFile -Path $_.FullName -folder $SharePointFolderPath #Add File to SharePoint Document Library folder
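For reference, the assignment works because Add-PnPFile returns a Microsoft.SharePoint.Client.File object, and PowerShell's default formatter (the format-default seen in the error) throws when it tries to render that object's lazily-loaded collections. Any way of discarding the output avoids the error - for example, piping to Out-Null instead of assigning to a variable (a sketch using the same variables as the script above):

```powershell
# Discard the returned client object instead of letting the default
# formatter render it (which raises CollectionNotInitializedException)
Add-PnPFile -Path $_.FullName -Folder $SharePointFolderPath | Out-Null
```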
Thanks
Synchronize your SharePoint Document library with a local folder on your PC. Then point the PowerShell script to that local folder. OneDrive will do the rest.