
dualsweat
Frequent Visitor

Dynamic Source Data with SharePoint (Question)

Hello All,

 

              I have created a folder on my SharePoint server called "Source_Data". Every day, a new subfolder is created containing a .csv file from the previous day. So, my Power BI source data comes from a directory structure like the one below:

 

::/SharePoint/Power_BI/Source_Data/Jan_01_2018/01012018.csv

::/SharePoint/Power_BI/Source_Data/Jan_02_2018/01022018.csv

::/SharePoint/Power_BI/Source_Data/Jan_03_2018/01032018.csv

::/SharePoint/Power_BI/Source_Data/Jan_04_2018/01042018.csv

::/SharePoint/Power_BI/Source_Data/Jan_05_2018/01052018.csv

 

              Every day, a new folder and .csv file are added to my Power BI "Source_Data" folder.

 

My Problem:

 

             Power BI only seems to display and recognize the first 10 days of .csv files that I loaded initially. Any subsequent days or .csv files that are automatically downloaded and added to the SharePoint directory do not appear in my reports. I have refreshed the data and even scheduled a refresh that ran successfully. But no joy.

 

             What am I missing? Is Power BI not able to deal with new files being added to the "Source_Data" directory?

 

Thank You,

 

M--------------

4 REPLIES
Salonic
Resolver I

Hello,

 

Could you share your query code from the Advanced Editor?

 

Hello, thank you for any help!

 

let
    Source = SharePoint.Files("https://XXXXX.sharepoint.com/sites/powerbi", [ApiVersion = 15]),
    #"Filtered Rows" = Table.SelectRows(Source, each ([Extension] = ".csv")),
    #"Filtered Rows1" = Table.SelectRows(#"Filtered Rows", each not Text.Contains([Name], "Summary")),
    #"Filtered Rows2" = Table.SelectRows(#"Filtered Rows1", each Text.Contains([Name], "Bts")),
    #"Invoke Custom Function1" = Table.AddColumn(#"Filtered Rows2", "Transform File from Query1", each #"Transform File from Query1"([Content])),
    #"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
    #"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from Query1"}),
    #"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from Query1", Table.ColumnNames(#"Transform File from Query1"(#"Sample File"))),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Table Column1",{{"Source.Name", type text}, {"Start Time", type text}, {"End Time", type text}, {"BSC Name", type text}, {"BSC IP Address", type text}, {"Call Setup Success Rate", type number}, {"Call Setup Success Rate - NR", type number}, {"Call Setup Success Rate - DR", type number}, {"Call Success Rate", type number}, {"Call Success Rate - NR", type number}, {"Call Success Rate - DR", type number}, {"CS Erlangs", type number}, {"Total Channel Drops", type number}, {"SDCCH Drop Count", type number}, {"SDCCH Drop Rate", type number}, {"SDCCH Drop Rate - NR", type number}, {"SDCCH Drop Rate - DR", type number}, {"TCH Drop Count", type number}, {"TCH Drop Rate", type number}, {"TCH Drop Rate - NR", type number}, {"TCH Drop Rate - DR", type number}, {"Immediate Assignment Procedure Success Rate", type number}, {"Immediate Assignment Procedure Success Rate - NR", type number}, {"Immediate Assignment Procedure Success Rate - DR", type number}, {"SDCCH Blocking Count", type number}, {"SDCCH Blocking Rate", type number}, {"SDCCH Blocking Rate - NR", type number}, {"SDCCH Blocking Rate - DR", type number}, {"TCH Blocking Count", type number}, {"TCH Blocking Rate", type number}, {"TCH Blocking Rate - NR", type number}, {"TCH Blocking Rate - DR", type number}, {"Number Of Tchf Attempts", type number}, {"Number Of Tchf Success", type number}, {"Number Of Tchh Attempts", type number}, {"Number Of Tchh Success", type number}, {"Number Of SDCCH Assignment Attempts", type number}, {"Number Of Successful SDCCH Assignments", type number}, {"External Handover Success Rate", type number}, {"External Handover Success Rate - NR", type number}, {"External Handover Success Rate - DR", type number}, {"Internal Handover Success Rate", type number}, {"Internal Handover Success Rate - NR", type number}, {"Internal Handover Success Rate - DR", type number}, {"Internal Handover Success Rate Prime", type number}, {"Internal Handover 
Success Rate Prime - NR", type number}, {"Internal Handover Success Rate Prime - DR", type number}, {"Total Voice Call Minutes", type number}, {"Mean Holding Time In Secs", type number}, {"Mean Holding Time In Secs - NR", type number}, {"Mean Holding Time In Secs - DR", type number}, {"DL TCH RxQual 0", type number}, {"DL TCH RxQual 1", type number}, {"DL TCH RxQual 2", type number}, {"DL TCH RxQual 3", type number}, {"DL TCH RxQual 4", type number}, {"DL TCH RxQual 5", type number}, {"DL TCH RxQual 6", type number}, {"DL TCH RxQual 7", type number}, {"UL TCH RxQual 0", type number}, {"UL TCH RxQual 1", type number}, {"UL TCH RxQual 2", type number}, {"UL TCH RxQual 3", type number}, {"UL TCH RxQual 4", type number}, {"UL TCH RxQual 5", type number}, {"UL TCH RxQual 6", type number}, {"UL TCH RxQual 7", type number}, {"UL TCH RxQual 0To5 Percent", type number}, {"DL TCH RxQual 0To5 Percent", type number}, {"SDCCH8 Blocking Count", type number}, {"Number Of Radio Connection Failures SDCCH4", type number}, {"Number Of Radio Connection Failures SDCCH8", type number}, {"Number Of Radio Connection Failures TCH_H", type number}, {"Number Of Radio Connection Failures TCH_F", type number}, {"Number Of SDCCH4 Assignment Attempts", type number}, {"Number Of SDCCH4 Assignment Success", type number}, {"Number Of SDCCH8 Assignment Attempts", type number}, {"Number Of SDCCH8 Assignment Success", type number}, {"SDCCH4 Blocking Count", type number}, {"TCH_H Blocking Count", type number}, {"TCH_F Blocking Count", type number}, {"Number Of Paging Attempts", type number}, {"Number Of Emergency Calls", type number}, {"Number Of Location Updates", type number}, {"Total Number Of Calls", type number}, {"External HandIn Success Rate", type number}, {"External HandIn Success Rate - NR", type number}, {"External HandIn Success Rate - DR", type number}}),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Changed Type", "Start Time", Splitter.SplitTextByDelimiter("-", QuoteStyle.Csv), {"Start Time.1", "Start Time.2"}),
    #"Split Column by Delimiter1" = Table.SplitColumn(#"Split Column by Delimiter", "End Time", Splitter.SplitTextByDelimiter("-", QuoteStyle.Csv), {"End Time.1", "End Time.2", "End Time.3", "End Time.4", "End Time.5"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter1",{{"Start Time.1", type date}, {"Start Time.2", type time}, {"End Time.1", type date}, {"End Time.2", type time}, {"End Time.3", type text}, {"End Time.4", type text}, {"End Time.5", type text}}),
    #"Renamed Columns" = Table.RenameColumns(#"Changed Type1",{{"Start Time.1", "Start Date"}, {"Start Time.2", "Start Time"}, {"End Time.1", "End Date"}, {"End Time.2", "End Time"}}),
    #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"End Time.3", "End Time.4", "End Time.5"})
in
    #"Removed Columns"

 

You should try with only the Source step, before applying any of the file filters, to confirm whether the new files appear in the raw listing at all.

Can you confirm that this problem concerns the service (powerbi.com) and not Power BI Desktop?
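 

To illustrate the suggestion above, a minimal diagnostic query (reusing the same masked site URL from the posted query) would return the raw file listing with no filters applied:

let
    // List every file the connector can see on the site, before any filtering
    Source = SharePoint.Files("https://XXXXX.sharepoint.com/sites/powerbi", [ApiVersion = 15])
in
    Source

If the new daily files show up here but not in the full query, one of the later filter steps is likely excluding them; in particular, check that the newer file names still satisfy the Text.Contains([Name], "Bts") condition and do not contain "Summary".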

 

 

Thanks Salonic. After reading a number of posts and documents, it seems SharePoint is not the ideal data source location. We are going to test with Azure and S3 and see how it goes.

 

Thanks!

 

M----------

 
