Salamalhasan

Unveiling the Power of Lakehouse's SQL Analytics Endpoint for Fabric Data Warehouse

The SQL analytics endpoint lets you easily query data in your Lakehouse using T-SQL over the TDS protocol. A workspace can contain multiple Lakehouses, each with its own SQL analytics endpoint. You don't need to create one manually; it's generated automatically for every Lakehouse. Behind the scenes, it uses the Warehouse's high-performance engine, ensuring fast SQL queries. Together, the Lakehouse and its SQL analytics endpoint make a flexible, powerful tool for data exploration and analysis.
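To make that concrete, here is a minimal sketch of the kind of T-SQL you can send to the endpoint. The table (dbo.sales) and its columns are hypothetical placeholders; adjust them to your own Lakehouse tables. The query is wrapped in a Python string so it can be reused from a notebook or client script:

```python
# A minimal T-SQL query of the kind the SQL analytics endpoint serves.
# Table name (dbo.sales) and columns (Item, Quantity) are hypothetical.
explore_query = """
SELECT Item, SUM(Quantity) AS TotalQuantity
FROM dbo.sales
GROUP BY Item
ORDER BY TotalQuantity DESC;
"""
```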

 

  

In a medallion architecture for lake data, files are organized into raw (bronze), consolidated (silver), and refined (gold) layers. If the data is stored in Delta Lake format, the SQL analytics endpoint can analyze the gold layer, even when it resides outside Microsoft Fabric OneLake.

 

In Microsoft Fabric Lakehouse, data is physically stored in OneLake with two main folders:

 

The /Files folder holds raw (bronze) files that still need processing by data engineers.

The /Tables folder contains refined and consolidated (gold) data in Delta Lake format, ready for analysis.

The SQL analytics endpoint can read data from the /Tables folder in OneLake, simplifying analysis through queries. With the Warehouse, you can perform cross-database queries and smoothly transition from read-only access to building business logic on top of your OneLake data using Synapse Data Warehouse.
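A cross-database query in T-SQL uses three-part names to join Warehouse and Lakehouse tables in the same workspace. This is a sketch; the item names (SalesWarehouse, SalesLakehouse) and tables (Orders, Customers) below are hypothetical:

```python
# Sketch of a cross-database T-SQL query joining a Warehouse table with a
# Lakehouse table via three-part names. All object names are hypothetical.
cross_db_query = """
SELECT w.OrderID, w.Amount, l.CustomerName
FROM SalesWarehouse.dbo.Orders AS w
JOIN SalesLakehouse.dbo.Customers AS l
    ON w.CustomerID = l.CustomerID;
"""
```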

 

Scenarios the Lakehouse enables for data warehousing

Let's begin by ingesting data into the lakehouse and then examine how the data warehouse can make efficient use of it.

 

Architecture

The following image shows the lakehouse end-to-end architecture:

Salamalhasan_1-1710862798202.png

 

The data used in this article comes from https://raw.githubusercontent.com/MicrosoftLearning/dp-data/main/sales.csv

Ingest Data

To effectively utilize your data warehouse, you must start by bringing data into it. Here, we'll demonstrate two ways of doing that: ingesting data into the lakehouse with the Copy activity, and referencing it with zero-copy shortcuts.

The first tool I'll use is the Copy Data Assistant to bring data into the lakehouse. 

Salamalhasan_2-1710862821456.png

 

Select the source to be Generic Protocol --> HTTP, then select Next.

Salamalhasan_3-1710862839166.png

As this data source is new, we need to create a new connection; name it SalesConnectionData.

Salamalhasan_4-1710862924581.png

 

Follow the wizard, keep the default values as they are, and then select Save & Run.

Go back to the lakehouse, and there you have it! A new table has been created, stored in Delta Lake format (Parquet files under the hood). Voila!

Salamalhasan_5-1710862950725.png
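If you prefer code to the wizard, roughly the same ingestion can be done from a Fabric notebook. The snippet below is only a sketch: the Spark part is shown in comments because it needs a live Fabric session, and the column names are assumptions based on the sample sales.csv:

```python
# Sketch of notebook-based ingestion. The self-contained helper parses CSV
# text; the commented Spark steps show how this would look in a real
# Fabric notebook. Column names are assumptions, not confirmed schema.
import csv
import io

def parse_sales_csv(text: str) -> list[dict]:
    """Parse CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

sample = "SalesOrderNumber,Item,Quantity\nSO43701,Mountain-100,1\n"
rows = parse_sales_csv(sample)

# In a Fabric notebook you would instead download sales.csv into the
# lakehouse Files area, then read and save it as a Delta table:
# df = spark.read.option("header", "true").csv("Files/sales.csv")
# df.write.format("delta").mode("overwrite").saveAsTable("sales")
```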

 

Data and transformation flow

 
After bringing data into the lakehouse, let's see how it looks in the data warehouse. The image below shows the source, destination, and data transformation details.

 

 Salamalhasan_5-1710861341177.png

 

The data is now accessible to third-party reporting tools through the TDS/SQL analytics endpoint. This endpoint enables you to establish a connection with the warehouse and execute SQL queries for analytical purposes.
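As a sketch of such a connection from Python, you can build an ODBC connection string for the endpoint and query it with pyodbc. The server address and Lakehouse name below are hypothetical placeholders; copy the real SQL connection string from the endpoint's settings in Fabric:

```python
# Minimal sketch: connecting to a SQL analytics endpoint over TDS.
# Server and database names are hypothetical placeholders.
def build_connection_string(server: str, database: str) -> str:
    """Build an ODBC connection string for a Fabric SQL analytics endpoint."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};"
        f"Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;"
    )

conn_str = build_connection_string(
    "myworkspace.datawarehouse.fabric.microsoft.com",  # hypothetical endpoint
    "SalesLakehouse",                                   # hypothetical Lakehouse
)

# To run a query (requires the pyodbc package and a Microsoft Entra login):
# import pyodbc
# with pyodbc.connect(conn_str) as conn:
#     for row in conn.execute("SELECT TOP 10 * FROM dbo.sales"):
#         print(row)
```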

 

In the example below, observe how the data is presented within the warehouse in a tabular structure.

 Salamalhasan_0-1710862635961.png

 

 

Furthermore, SQL analytics endpoint enables direct data analysis within the Warehouse or Lakehouse without the need to transfer it, even across distinct Microsoft Fabric workspaces. As mentioned earlier, data in Microsoft Fabric Lakehouses is stored in OneLake. Consequently, Shortcuts simplify the process of referencing folders in any OneLake location.

 

Salamalhasan_6-1710863056638.png

 


  

I have a Data Warehouse, so I will utilize it in this example.

 

Salamalhasan_7-1710863072092.png

 


  

 

 You can select the tables you want to reference in the lakehouse.

 

Salamalhasan_10-1710863345469.png

 

 

 After finishing, take a look at the lakehouse. You'll see the tables are there.

Salamalhasan_9-1710863129490.png
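Because a shortcut only references the data, these tables can be queried from the lakehouse's SQL analytics endpoint like any local Delta table, with no copying involved. A sketch, with a hypothetical table name:

```python
# A shortcut table is queried exactly like a native table; the name
# (dbo.dimension_customer) is a hypothetical example.
shortcut_query = "SELECT TOP 10 * FROM dbo.dimension_customer;"
```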

 

Conclusion

In this article, we looked at different ways to bring in data, like using the Copy Data tool and shortcuts; notebooks will come up in a later article. If you're diving into data warehousing, the Lakehouse's SQL analytics endpoint gives you T-SQL access to your lake data without copying it anywhere. Whether you're just getting started or a seasoned pro, the tools covered here are building blocks: they help you get a strong grasp on data warehousing basics and set the scene for upcoming articles.

References:

What is data warehousing in Microsoft Fabric? - Microsoft Fabric | Microsoft Learn

Lakehouse end-to-end scenario: overview and architecture - Microsoft Fabric | Microsoft Learn

Better together - the lakehouse and warehouse - Microsoft Fabric | Microsoft Learn

Referencing data to a Lakehouse using shortcuts - Microsoft Fabric | Microsoft Learn

fabric-docs/docs/data-factory/how-to-ingest-data-into-fabric-from-azure-data-factory.md at main · Mi...

Wide World Importers - Sample database for SQL - SQL Server | Microsoft Learn