MarcusA
New Member

DFGen2 Unable to create Destination table in Lakehouse

Hi,

I'm trying to create a destination table in a Lakehouse while using DFGen2 to import data, but I am unable to create tables, both when using the Data Destination - Choose destination target dialog and when running the DFGen2 dataflow. The source is an on-prem SQL Server accessed through a gateway. I've tried one table with a single column allowing Null, and another table with 3 columns Not Null. Please see the attached error screenshots.

 

Attachments: DFGen2run_error.PNG, DFGen2_error.PNG

20 REPLIES
Element115
Power Participant

@MarcusA Regarding connectivity issues between DF and on-prem DB, here is a new post in which I explain the resolution of my misadventure in this regard: DATAFLOW and ON-PREM DB connectivity solution - Microsoft Fabric Community

 

The long and short of it: what solved it was to ignore what is stated in the documentation here: On-premises data gateway considerations for data destinations in Dataflow Gen2 - Microsoft Fabric | ... 

 

After removing the endpoints from the firewall rules and replacing them with a single rule for the MS SQL Server service, opening port TCP/1433, everything started to work as it should.
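If you want to confirm that the firewall change actually took effect before re-running the dataflow, a quick reachability test from the gateway machine is enough. Below is a minimal Python sketch; the endpoint name is a hypothetical placeholder, so substitute the SQL connection string shown for your own Lakehouse or Warehouse:

# Minimal TCP/1433 reachability check, intended to run on the gateway machine.
# LAKEHOUSE_SQL_ENDPOINT is a placeholder, not a value from this thread; use the
# SQL connection string shown on your Lakehouse's SQL analytics endpoint.
import socket

LAKEHOUSE_SQL_ENDPOINT = "your-workspace.datawarehouse.fabric.microsoft.com"
PORT = 1433  # TDS port the Fabric SQL endpoints listen on

try:
    with socket.create_connection((LAKEHOUSE_SQL_ENDPOINT, PORT), timeout=10):
        print(f"TCP {PORT} to {LAKEHOUSE_SQL_ENDPOINT} is open through the firewall")
except OSError as exc:
    print(f"Cannot reach {LAKEHOUSE_SQL_ENDPOINT}:{PORT} -> {exc}")

If the connection fails here, the gateway has no line of sight to the destination and no dataflow setting will fix it.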

Hi @MarcusA 

 

We haven’t heard back from you on the last response and wanted to check whether you have a resolution yet.
If you do have a resolution, please share it with the community, as it can be helpful to others.
Otherwise, please reply with more details and we will try to help.


Thanks.

Hi @MarcusA 

 

We haven’t heard back from you on the last response and wanted to check whether you have a resolution yet. If you do have a resolution, please share it with the community, as it can be helpful to others.
If you have any questions relating to the current thread, please let us know and we will try our best to help you.
If you have a question about a different issue, we request that you open a new thread.


Thanks.

Element115
Power Participant

Out of curiosity, (1) is this the first time you've tried to do this? Or (2) has it worked before and then suddenly stopped working?

1: First time trying this out

@MarcusA btw, and I should have mentioned this before: since you are trying to extract from an on-prem DB, have you tried the two-dataflow chained workaround? Microsoft briefly explains it here:

 

On-premises data gateway considerations for data destinations in Dataflow Gen2 - Microsoft Fabric | ...

 

This workaround solved the 'Something went wrong' error in the data destination dialog for me, as well as the other errors that would pop up at refresh time (after you publish).

 

 

I've tried the workaround mentioned and it didn't work.

Screenshot, code and details please.

Thanks for your enquiry. My Dataflow Gen2 dataflows had been working well since I created them last week; it's only from this morning that they started failing to refresh. When I detached the destinations they ran fine, but I couldn't reconnect them to the destination lakehouse again.

 

 

I had followed the workaround by creating a staging dataflow and then connecting my Dataflow Gen2 to take data from that staging dataflow. It still didn't work. (attachment: 4.png)

This again indicates your GW cannot see the LH server (or cannot connect to it for whatever reason).

Please gather the following things:

- Session ID (In the options dialog)

- Gateway logs with "Additional Logging" turned on while the repro is performed.

 

P.S. We are working on a change that will show more details when this error happens.

The session ID of the failed WriteDataToDestination is d95209b8-dd1a-439d-9eb9-697cab75c5cf

 

DataflowID is 795512cf-8e8a-4034-8f29-4641664c8bcb

 

Tried the workaround mentioned (staging dataflow extracting data from SharePoint & second dataflow writing that to the destination lakehouse); it didn't work at all.

 

The same error not only affected my dataflows writing to the lakehouse/warehouse, but also occurred with my Power BI connection to Dataflow Gen2 & Lakehouse tables.

 

(attachments: 5a.png, 6.png, 8.png, 7.png)

 

I had just migrated a significant portion of my ETL pipelines to Fabric Dataflow Gen2 & Lakehouse. Everything was working perfectly fine for a week until this suddenly happened, ruining my efforts.

 

I'm a business analyst with little understanding of backend security protocols & gateways, so it would be helpful to have a clearer solution.

@Yusuf7 this seems to be a different issue from what's reported in this thread. First of all, your dataflow does not use a GW, so the error is entirely on the service side and not related to any firewall rules, etc. Secondly, I can see that your dataflow is now refreshing successfully again - can you please confirm the error went away?

 

FWIW the error that you reported above is related to your destination lakehouse. Suddenly it became unavailable and then came back online a couple days later. I'll engage the LH team to try to understand what happened to it during the outage period.

Hi

Thanks for your response. Yes, the WriteDataToDestination problem with the dataflow, along with all the other problems mentioned, resolved automatically and everything was back to normal two days after it suddenly started (Friday 19th April).

Our customer reporting and ETL processes are being migrated to Fabric warehouses and lakehouses through the Data Factory experience, and the possibility of these kinds of outages happening again makes me a bit worried.


I had experienced the same problem mentioned in the thread (screenshot attached; I couldn’t select a destination). Besides, all Fabric items were inaccessible from Power BI Desktop, and Dataflow Gen2 items were inaccessible and couldn't write to the destination lakehouse/warehouses.

I had used the workaround mentioned in this Microsoft article (using a staging dataflow and a destination-ingesting dataflow as described), but it didn’t fix anything. https://learn.microsoft.com/en-us/fabric/data-factory/gateway-considerations-output-destinations#sol...


I appreciate your help on this issue and hope that the outage won’t randomly occur again.

 

 

(attachment: IMG_0064.png)

Unfortunately this means Fabric is not yet ready for real-time or near-real-time data processing, nor for scenarios that require a 99.999% uptime guarantee. This is bad news especially for businesses in the financial industry that need to keep a live finger (and not a sleepy one) on the pulse of the market, and for other time-critical business cases.


I can imagine it is not a trivial challenge for Microsoft to provide such an SLA. Nonetheless, perhaps the marketing department should take its cue from the engineering department before putting the cart before the horse?

OK well... if nothing changed when it stopped working, then it's up to Microsoft to step in here. By the way, we can still see the names you blacked out when we display your screenshot in full screen; the black color is slightly transparent 😉

My bad 😞 Thanks for letting me know though!!

Have tried the workaround, did not work for us.

We need more details @MarcusA to help you better.

 

Can you see data coming into the first DF?

 

Did you enable staging on the query in the first DF and disable it on the query in the second DF?

 

Did you set a data destination on the query in the second DF BUT NOT on the query in the first DF?  

pqian_MSFT
Employee

It seems like your gateway is rejecting the TLS connection to the lakehouse server (while having line of sight). Check the following things (a quick diagnostic sketch follows the list):

- You have TLS 1.2+ enabled on the GW

- You can validate server certificate chains (I've seen cases where the GW has no access to certain certificate authorities and thus cannot validate the chain)
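To exercise both checks from the gateway machine, here is a minimal Python sketch. It is only a proxy test: it speaks HTTPS (port 443) to a Fabric endpoint rather than TDS on 1433, and the host name is an assumption, but it will tell you whether the machine can negotiate TLS 1.2+ and validate a Microsoft-issued certificate chain against its trusted roots:

# Minimal TLS diagnostic, assuming it runs on the gateway machine.
# It checks the two points above: that TLS 1.2+ can be negotiated and that the
# certificate chain validates against the machine's trusted root store.
# NOTE: this connects over HTTPS (443) to an assumed Fabric endpoint, not TDS on 1433,
# so treat it as a proxy check for CA trust and protocol support only.
import socket
import ssl

HOST = "onelake.dfs.fabric.microsoft.com"  # assumed Fabric endpoint; adjust as needed
PORT = 443

context = ssl.create_default_context()            # validates the chain by default
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse anything older than TLS 1.2

try:
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Negotiated:", tls.version())  # e.g. TLSv1.2 or TLSv1.3
            print("Issuer:", dict(x[0] for x in tls.getpeercert()["issuer"]))
except ssl.SSLCertVerificationError as exc:
    print("Certificate chain could not be validated:", exc)
except OSError as exc:
    print("Connection failed:", exc)

If this prints a negotiated protocol and an issuer, the machine's CA trust and TLS support are fine and the problem is more likely in the gateway configuration itself.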

Using TLS 1.2
