We have a bizarre issue when we try to edit existing dataflows or create new ones:
It errors because it no longer accepts the "-" in the host name. Note that existing dataflows built using the same connection method still run, and the exact same script still works in Desktop. The issue only appears when we try to edit flows.
I've raised this internally, but I wondered if anyone has seen this before?
Thanks, but it's not a Snowflake problem. I can connect and run identical queries using Power BI Desktop or a DB manager like DBeaver, and the same connection strings with the same SQL work just fine. The issue is dataflows.
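For what it's worth, hyphens are perfectly legal in DNS host names, which supports the point that the dataflow editor rejecting "-" is a parsing bug rather than a Snowflake issue. A minimal sketch of an RFC 1123 host name check (the Snowflake host below is a hypothetical example, not a real account):

```python
import re

# RFC 1123 label: letters, digits, and hyphens; a label may not
# start or end with a hyphen, and is at most 63 characters long.
LABEL = re.compile(r"^[A-Za-z0-9](?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?$")

def is_valid_hostname(host: str) -> bool:
    """Return True if every dot-separated label in `host` is RFC 1123 valid."""
    if not host or len(host) > 253:
        return False
    return all(LABEL.match(label) for label in host.split("."))

# A hypothetical Snowflake account host containing hyphens:
print(is_valid_hostname("my-org-myaccount.snowflakecomputing.com"))  # True
print(is_valid_hostname("-bad.example.com"))                         # False
```

Any client that rejects the first host name above is stricter than the DNS naming rules themselves.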
Understand Snowflake Architecture: Before connecting data flows to Snowflake, it's essential to understand the basics of Snowflake architecture. Familiarize yourself with Snowflake's concepts such as warehouses, databases, schemas, and tables.
Get Snowflake Credentials: Obtain the necessary credentials to connect to Snowflake. This typically includes the account URL, username, password, and other relevant information.
Choose Data Flow Tool: Identify the data flow tool or platform you're using. Popular tools include Apache NiFi, Apache Kafka, Talend, Apache Airflow, and others. The steps for connecting to Snowflake can vary based on the tool.
Install Drivers or Connectors: Check if your data flow tool requires specific drivers or connectors to connect to Snowflake. Some tools may have native support for Snowflake, while others may require additional plugins.
Configure Connection Settings: Within your data flow tool, locate the settings or configurations related to database connections. Provide the Snowflake credentials and connection details, including the account URL, username, password, and any other required information.
Define Data Flows: Design your data flows within the data flow tool. Specify the source of your data (e.g., other databases, files, streaming sources) and configure the destination to be Snowflake.
Map Data Fields: Ensure that the data fields from your source match the structure of the Snowflake tables where you want to store the data. Perform any necessary data mapping or transformations.
Test the Connection: Before running your data flows in a production environment, perform tests to ensure that the connection to Snowflake is working correctly. Verify that data is being transferred as expected.
Handle Errors and Monitoring: Implement error handling mechanisms within your data flow tool to manage any issues that may arise during data transfer. Set up monitoring and logging to track the performance of your data flows.
Scale for Production: Once you've successfully tested your data flows, scale them for production use. Consider factors like performance, scalability, and security.
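The credential and configuration steps above (steps on credentials, connection settings, and testing) can be sketched as a small helper that gathers the settings in one place before handing them to whatever connector your tool uses. This is a minimal illustration with made-up account and user names; real clients such as snowflake-connector-python take similar parameters, but this sketch deliberately avoids depending on any specific library:

```python
def build_snowflake_config(account: str, user: str, password: str,
                           warehouse: str, database: str, schema: str) -> dict:
    """Assemble Snowflake connection settings and fail fast on blanks."""
    # locals() here holds exactly the six parameters, so we can
    # validate that none of them were passed as empty strings.
    missing = [name for name, value in locals().items() if not value]
    if missing:
        raise ValueError(f"missing required settings: {missing}")
    return {
        "account": account,
        "user": user,
        "password": password,
        "warehouse": warehouse,
        "database": database,
        "schema": schema,
        # Snowflake clients typically derive the account URL like this:
        "url": f"https://{account}.snowflakecomputing.com",
    }

# Hypothetical values for illustration only:
cfg = build_snowflake_config("my-org-myaccount", "etl_user", "secret",
                             "COMPUTE_WH", "ANALYTICS", "PUBLIC")
print(cfg["url"])  # https://my-org-myaccount.snowflakecomputing.com
```

Centralising the settings like this makes the "Test the Connection" step easier: you validate the configuration once, then reuse the same dict for every dataflow that targets the same Snowflake account.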
Hi @dazzerd
Submit a support ticket to Snowflake; they will resolve it ASAP.
Regards
Amine Jerbi
If I answered your question, please mark this thread as accepted