
DataFlow refresh issues with Sharepoint Online folder source

Our DataFlows that read Excel and CSV files from a SharePoint Online folder (document library) have all begun failing to refresh this week.

Most of the errors seem to indicate a connectivity issue (or at least a problem maintaining the connection):

Error: Data Source Error : The underlying connection was closed: An unexpected error occurred on a send... RootActivityId = 11af763a-f083-413e-a072-996fe7093913.Param1 = The underlying connection was closed: An unexpected error occurred on a send. Request ID: a925320e-7f06-a2d3-47ea-8558c6ff5c82.

 

It's been incredibly frustrating dealing with some of these DataFlow bugs lately -- e.g. the hidden-files bug and validating queries getting hung up.

Status: New
Comments
v-qiuyu-msft
Community Support

Hi @Anonymous, 

 

I tested on my side and was able to refresh a dataflow that connects to a SharePoint Online folder. Please refresh the dataflow manually to see if the same issue occurs.

 

Best Regards,
Qiuyun Yu

pat_mecee
Frequent Visitor

Following this thread. Dataflows sourcing Excel files directly via URL are now failing as well.

 

The URL is an external public website with anonymous access working fine. SharePoint is not involved at all, and no gateway is involved.

 

"Error: DataSource.Error: The downloaded data is HTML which isn't the expected type. The URL may be wrong or you might not have provided the right credentials to the server."

 

These dataflows all worked yesterday.

 

What else has changed?

Anonymous
Not applicable

I've tried to attack this problem from many angles. I've reverted to the original working code, removed objects, exported/imported the JSON, and toggled compute entities, but to no avail.

The only other thing I can think of that has changed is the number of files in SharePoint, which keeps growing. Currently there are 56 files totaling 360 MB... could there be a limit on the SharePoint side that prevents the Power BI service from reading them?

 

v-qiuyu-msft
Community Support

Hi all, 

 

I would suggest you create a support ticket so engineers can look into the issue on your side.

 

[Attachment: Support Ticket.gif]

 

Best Regards,
Qiuyun Yu

anselmojg
Advocate II

We have had the same problem since Monday afternoon. We shared the case with Microsoft and so far there is no solution; both the Power BI and SharePoint engineering teams are looking into it. We're not sure where the connection between Power BI and SharePoint Online breaks. The issue is not exclusive to dataflows: datasets with connections to SharePoint Online also fail to refresh. All other data sources (e.g. SAP HANA) are working seamlessly in the meantime, which somewhat rules out capacity or memory limits.

Hope to get a solution soon!

Anonymous
Not applicable

Same issue here. My dataflows and datasets are failing since Saturday. 

jcullum
Advocate II

We have a similar problem @Anonymous, @anselmojg, @pat_mecee. A colleague of mine spotted ours having problems with the "Filter Hidden Files" step.

 

The Power Query function is:

Table.SelectRows(#"Filtered rows 1", each [Attributes][Hidden] <> true)

We removed those filters (which are automatically generated via the UI), again via the UI, and ours now refresh.

I have been unable to create a new dataflow that performs a combine on SharePoint files. As I stated, the UI automatically generates the filter on hidden files, and it causes any new query to fail, complaining about the hidden attributes.
 
Here is another link confirming the same.
 
 
 
I am not sure if yours are similar, but you can check whether your dataflows have that filter condition.
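As a sketch, one defensive way to rewrite the failing step (step names here match the auto-generated query above; Record.FieldOrDefault is a standard M library function) so a missing Hidden field no longer errors:

```m
// Auto-generated step that now fails, because the Attributes record
// no longer contains a Hidden field:
// #"Filtered hidden files" = Table.SelectRows(#"Filtered rows 1",
//     each [Attributes][Hidden] <> true)

// Defensive alternative: treat a missing Hidden field as "not hidden"
// instead of raising a field-not-found error:
#"Filtered hidden files" = Table.SelectRows(#"Filtered rows 1",
    each Record.FieldOrDefault([Attributes], "Hidden", false) <> true)
```

This keeps the intent of the original filter (exclude hidden files) while tolerating the schema change; removing the step entirely, as described above, also works if you don't need the filter.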
 
 
pat_mecee
Frequent Visitor

Hi all,

 

I can confirm that this issue is now with the MS SharePoint team, after the Power BI team confirmed our diagnosis last week. Removing the "filter hidden files" step in the query has allowed all affected datasets and dataflows to refresh successfully, because that step was looking up [Attributes]?[Hidden]? within the automatically generated folder-file binary combine.

 

The SharePoint attribute schema was modified last week. The trick is to expand the "Attributes" field and view the associated record contents in the first Source step (SharePoint site or folder) of the query. You will no longer find a "Hidden" record field -- which is why all the queries are failing.

 

Getting the MS SharePoint team to acknowledge the consequences of their schema change is a different story...
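To verify this diagnosis yourself, you can list the field names of the Attributes record from the Source step. A minimal sketch (the site URL is a placeholder; "Source" is the usual name of the first step produced by SharePoint.Files):

```m
let
    // First step of an auto-generated folder query:
    Source = SharePoint.Files(
        "https://contoso.sharepoint.com/sites/YourSite",
        [ApiVersion = 15]),
    // Field names of the first file's Attributes record.
    // If "Hidden" is missing from this list, any step that
    // references [Attributes][Hidden] will fail:
    AttributeFields = Record.FieldNames(Source{0}[Attributes])
in
    AttributeFields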

Anonymous
Not applicable

@pat_mecee 

 

Thanks for the suggestion. My dataflows are failing and I don't have the "Filter Hidden Files" step in my query. My query reads the SharePoint folder without the standard built-in steps that the UI generates.

 

Any other suggestions? 

 

 

GTS_ONE
Advocate II

I'm also seeing failures on Dataflows that do not have the "Filter Hidden Files" step.