dazzerd
Frequent Visitor

Connecting Data Flows to Snowflake

We have a bizarre issue when we try to edit existing dataflows or create new ones:

 

[Screenshot attachment: dazzerd_0-1701870183812.png — the error shown in the dataflow editor]

 

It's erroring because it no longer accepts the "-" in the host name. Note that existing dataflows built with the same connection method still run, and the exact same script still works in Power BI Desktop. The issue only shows up when we try to edit or create flows.
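
For illustration (placeholder host and warehouse names, not our real ones), a dataflow connection of this kind is just the standard Snowflake connector call in M:

    let
        // Hypothetical host and warehouse names; the hyphenated host is the part
        // the dataflow editor now rejects when the query is edited.
        Source = Snowflake.Databases("my-org.snowflakecomputing.com", "MY_WAREHOUSE")
    in
        Source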

We have raised this internally, but has anyone seen it before?


3 REPLIES
dazzerd
Frequent Visitor

Thanks, but it's not a Snowflake problem. I can connect and run identical queries using Power BI Desktop or a database manager like DBeaver, and the same connection strings with the same SQL work just fine. The issue is specific to dataflows.
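
For example, a DBeaver JDBC URL shaped like the one below (placeholder account, warehouse and database names) connects and queries without any complaint about the hyphen:

    jdbc:snowflake://my-org-account.snowflakecomputing.com/?warehouse=MY_WAREHOUSE&db=MY_DB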

soniya-01
New Member

  1. Understand Snowflake Architecture: Before connecting data flows to Snowflake, familiarize yourself with Snowflake concepts such as warehouses, databases, schemas, and tables.

  2. Get Snowflake Credentials: Obtain the necessary credentials to connect to Snowflake. This typically includes the account URL, username, password, and other relevant information.

  3. Choose Data Flow Tool: Identify the data flow tool or platform you're using. Popular tools include Apache NiFi, Apache Kafka, Talend, Apache Airflow, and others. The steps for connecting to Snowflake can vary based on the tool.

  4. Install Drivers or Connectors: Check if your data flow tool requires specific drivers or connectors to connect to Snowflake. Some tools may have native support for Snowflake, while others may require additional plugins.

  5. Configure Connection Settings: Within your data flow tool, locate the settings related to database connections and provide the Snowflake credentials and connection details, including the account URL, username, password, and any other required information (for the Power BI dataflow case, see the M sketch after this list).

  6. Define Data Flows: Design your data flows within the data flow tool. Specify the source of your data (e.g., other databases, files, streaming sources) and configure the destination to be Snowflake.

  7. Map Data Fields: Ensure that the data fields from your source match the structure of the Snowflake tables where you want to store the data. Perform any necessary data mapping or transformations.

  8. Test the Connection: Before running your data flows in a production environment, perform tests to ensure that the connection to Snowflake is working correctly. Verify that data is being transferred as expected.

  9. Handle Errors and Monitoring: Implement error handling mechanisms within your data flow tool to manage any issues that may arise during data transfer. Set up monitoring and logging to track the performance of your data flows.

  10. Scale for Production: Once you've successfully tested your data flows, scale them for production use. Consider factors like performance, scalability, and security.
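
For the Power BI dataflow case this thread is about, steps 5-7 amount to a single Power Query M query. A minimal sketch, with placeholder host, database, schema, table and column names:

    let
        // All names below are placeholders for illustration.
        Source   = Snowflake.Databases("my-org.snowflakecomputing.com", "MY_WAREHOUSE"),
        // Navigate to the database, schema and table (step 6).
        Database = Source{[Name = "MY_DB", Kind = "Database"]}[Data],
        Schema   = Database{[Name = "PUBLIC", Kind = "Schema"]}[Data],
        Orders   = Schema{[Name = "ORDERS", Kind = "Table"]}[Data],
        // Map source fields onto the names the destination expects (step 7).
        Mapped   = Table.RenameColumns(
                       Table.SelectColumns(Orders, {"O_ORDERKEY", "O_TOTALPRICE"}),
                       {{"O_ORDERKEY", "OrderKey"}, {"O_TOTALPRICE", "TotalPrice"}})
    in
        Mapped

Testing the connection (step 8) then comes down to previewing this query in the dataflow editor before saving and refreshing.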

aj1973
Community Champion

Hi @dazzerd 

Submit a support ticket to Snowflake; they will resolve it ASAP.

Regards
Amine Jerbi

If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook
