
bdunzweiler
Helper I

Dataflows - refresh gives error about AzureStorage.BlobContents matches no exports

I created a dataflow with several custom entities from an on-premises SQL Server. After saving the entities successfully, the refresh is not working. It gives this error for all of the custom entities:

 

Error: The import AzureStorage.BlobContents matches no exports. Did you miss a module reference?

 

The gateway is online and is actively used by datasets within the workspace.  Is there any other setup that needs to occur in my org's environment?

1 ACCEPTED SOLUTION

I have the same problem. I am trying to pull a table from my on-premises server into a dataflow; I can save it, but when I try to refresh I get this error:

 

Error: The import AzureStorage.BlobContents matches no exports. Did you miss a module reference?

 

I know my data gateway works, as I have several datasets already pulling this table from the same DB server.

 

Any ideas?

 

 

 

******* Update 15-11-2018 ************

 

I think I found my problem, and it's working now.

 

I want to describe my environment and case so people can check if they have the same scenario.

 

1) I have the enterprise data gateway installed and working for my Power BI tenant; no problem here.

2) I normally use a service account (Windows account) as the credential to access my on-premises data sources. That service account does not have Office 365 or Power BI access, so I use my personal account to create the data source in the gateway, but that service account to access my databases.

 

When I created the dataflow, it seems the gateway found the data source already created in my gateway and also reused its credential information, so I did not pass new credentials.

 

I don't know how the internals of this process work, but I guess the dataflow somehow uses those credentials to do something, and it fails. (I don't think the error I am getting describes the problem well.)

 

Workaround: I have another data gateway that does not have the data source created, so the dataflow process asks me for credentials. When I use my personal credentials, which have Office 365 and Power BI access, it works without problem.

 

Hope this can help somebody else.

 

 

 


4 REPLIES
mdhopkins
Frequent Visitor

I know that I am very late to the party here, but I am having the same issue. The accepted solution says to not use the data gateway as an option and pass credentials manually, but I am not sure how to do this.

 

In reading the responses (@Anonymous, @rmillan) it seems that the issue is the same for me: I have one set of credentials (Windows/Office) that I use to access the Gateway/Power BI/SharePoint and another to access the database.

 

Does anyone have any updates on this? It is pretty frustrating. I can see the data in the dataflow when I create it and add the entity, but it fails on refresh. I am not sure if it makes a difference, but I created a blank query and am using a native database query to get the data.
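For anyone comparing setups: a blank-query entity using a native database query typically looks like this in Power Query M (the server, database, and SQL below are placeholders, not the actual names from this thread):

```
let
    // Connect through the gateway to the on-premises SQL Server.
    // "MyServer" / "MyDatabase" / the SELECT are hypothetical examples.
    Source = Sql.Database(
        "MyServer",
        "MyDatabase",
        [Query = "SELECT Id, Name FROM dbo.Customers"]
    )
in
    Source
```

Note that when a `Query` option is passed like this, the gateway must evaluate the native SQL under whichever credential is stored for that data source, which is why the cached-credential issue described in the accepted solution can surface on refresh even though the preview works at design time.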

Anonymous
Not applicable

Hi @bdunzweiler ,

 

A bit late, however I can confirm @rmillan's solution.

 

It seems that when you setup the dataflow using the cached credentials from the Data Gateway (i.e. you let the dataflow use the credentials stored by the data gateway), you receive the error "Error: The import AzureStorage.BlobContents matches no exports. Did you miss a module reference?".

 

To fix this, you have to NOT use the data gateway as an option, and instead pass credentials manually. This enabled me to successfully refresh the dataflow and connect to it using Power BI Desktop. During the data load into PBI Desktop it referenced connecting to blob storage, so it appears there is some ancillary work happening in the background when the Dataflow is stored.

 

Regards,

Sean

v-shex-msft
Community Support

Hi @bdunzweiler,

 

Can you please explain more about this?

How to Get Your Question Answered Quickly

 

Do you mean configuring the Power BI on-premises gateway on the Azure side to handle the dataflow? If that is the case, I'd suggest posting this to the Azure forum to get better support.


You can also take a look at the following documents, which describe how to configure the on-premises gateway and export Blob data:

Connect to data sources on premises from Azure Logic Apps with on-premises data gateway

Use the Azure Import/Export service to export data from Azure Blob storage

 

If you mean an Azure Stream Analytics job, you can take a look at the following link:

Quickstart: Create a Stream Analytics job by using the Azure portal

 

If you mean a CDS for Analytics dataflow, you can refer to the following link (currently it does not seem to support Azure export jobs as a data source):

Add data to an entity in Common Data Service for Apps by using Power Query

 

BTW, if your requirement is related to CDS-A, you can post this to the PowerApps forum for further support.

 

Regards,

Xiaoxin Sheng

Community Support Team _ Xiaoxin
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
