Hello,
We are testing Power BI to see what details it can provide about folder/subfolder/files.
So, pretty simply, pointing to a folder (in Get Data) and trying to get info about the contents of those folders. We do not need to interrogate the contents of those underlying files. Just need to know attributes/metadata about the file - name, folder, extension, size, last time accessed, etc.
This is extremely slow and takes several hours to run. All we want is the metadata/properties of each file. Is there a setting or way to do this so it runs quicker? We do have ~40k underlying files but it took ~6 hours to run and load. This was just one folder structure.
Thanks for any help you can provide,
Dan
Solved! Go to Solution.
Use PowerShell; this isn't really a use case Power BI was designed for. I've written multi-threaded PowerShell to scan folder structures, and it's a much better tool for the job. Use PowerShell to generate a CSV file of the metadata, then import that CSV into Power BI. You are trying to use a miter saw to hammer a nail.
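The PowerShell route is typically a `Get-ChildItem -Recurse | Select-Object ... | Export-Csv` pipeline. This is not the answerer's actual script, but a minimal single-threaded sketch of the same idea in Python, in case PowerShell isn't available: walk the folder tree, record only file attributes (name, folder, extension, size, last modified), and write them to a CSV that Power BI can import quickly. The function name `scan_folder_metadata` and the column names are illustrative choices, not anything from the thread.

```python
import csv
import os
import time

def scan_folder_metadata(root, out_csv):
    """Walk `root` and write one CSV row of file metadata per file.

    Only attributes are read (via os.stat); file contents are never opened,
    which is why this is fast even for tens of thousands of files.
    """
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["Name", "Folder", "Extension", "SizeBytes", "LastModified"])
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                st = os.stat(path)
                writer.writerow([
                    name,
                    dirpath,
                    os.path.splitext(name)[1],  # extension, e.g. ".xlsx"
                    st.st_size,
                    time.strftime("%Y-%m-%d %H:%M:%S",
                                  time.localtime(st.st_mtime)),
                ])
```

Point Power BI's CSV connector at the output file instead of the Folder connector; re-running the scan on a schedule keeps the CSV current without Power Query ever touching the 40k files.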
@Greg_Deckler Do you mean using PowerShell to scan the Excel files and write their metadata out to CSV files?
Because I have the exact same issue: huge Excel files as data, updated monthly, and Power BI refreshes all the data sources from scratch, which takes forever.