Please share your opinions on this subject.
I successfully connected to my remote MSSQL Server and can import those tables necessary to generate reports.
When I import tables, I can hide/delete columns, filter records and more.
However, I can also write the corresponding query in SQL Studio and that will return only the records I need, with fields I need.
Importing whole tables would probably give me more flexibility to build different reports later, but writing the SQL query is much faster for me. Importing query results is also much faster than importing several large tables.
Both options have benefits and drawbacks. What would you recommend, and why?
Using DirectQuery can help if your table is too large to import. From what I can tell, importing with parameters in Power BI is pretty difficult. I tend to create a SQL view with CTEs that act as parameters, use another CTE to bring it all together so the view returns only a smaller data subset, and then import that into Power BI.
The import is fairly quick for tables under 30 million rows.
@Mark_Timson wrote: Using direct query can help if your table is too large to import. [...] The import is fairly quick for tables under 30 million rows.
That is exactly where I stand.
Yes, the tables I need to import are large in terms of both fields and records, so I have come to the same solution: I use CTEs to select only the fields and records I need, then use Power BI filters or measures to apply further selection. The main data, though, are extracted with a SQL query.
Thank you @Mark_Timson
It's better when you have a date range you need to get data for. In my case I had a very large table, over 200 million rows. I couldn't load all of that data into a .pbix file, so I filtered the data down to a range using two CTEs and a final SELECT, and put it all into a view like the one below:
CREATE VIEW [dbo].[Example]
AS
WITH [StartDate] AS (
    SELECT [date] AS [FirstDate]
    FROM [dbo].[lookup_date] ld
    WHERE CAST([ld].[date] AS DATE) = CAST(DATEADD(YEAR, -2, GETDATE()) AS DATE) -- Get 2 years' worth of data
),
[EndDate] AS (
    SELECT [date] AS [LastDate]
    FROM [dbo].[lookup_date] ld
    WHERE CAST([ld].[date] AS DATE) = CAST(GETDATE() AS DATE) -- Up to today
),
[stk_retail] AS (
    SELECT [week], [item_code], [alloc_wh_stock], [avail_wh_stock], [branch_stock], [wip_in_transit]
    FROM [dbo].[retail]
    WHERE [week] >= (SELECT [FirstDate] FROM [StartDate])
      AND [week] <= (SELECT [LastDate] FROM [EndDate])
)
SELECT [week], [item_code], [alloc_wh_stock], [avail_wh_stock], [branch_stock], [wip_in_transit]
FROM [stk_retail];
GO
This does two things: it gets only the columns I need, making my dataset and model much smaller, and it gets only the rows I need to import into the model, making it easier to load into Power BI.
For models where you need very large amounts of data, creating an SSAS Tabular model and partitioning it would probably be the way to go.
Hi @EVEAdmin ,
Actually, the Power BI SQL connector also supports using a T-SQL query to choose specific fields. You can find it under Advanced options, in the SQL statement box:
Quickstart: Visualize data using the Azure Data Explorer connector for Power BI
Regards,
Xiaoxin Sheng
@v-shex-msft wrote: Actually, the Power BI SQL connector also supports using a T-SQL query to choose specific fields. [...]
@v-shex-msft many thanks, appreciated.
I am aware of that option, but I must say I have never used it to filter records when importing a table.
Say I am importing 4 tables: shall I just enter 4 SELECT statements, separated by semicolons, to get all the records I need?
Alternatively, I could write the complete T-SQL query, including JOINs etc., import the exact records and fields, and then use Power BI only to generate reports, skipping the modeling process.
Hi @EVEAdmin ,
Actually, you need to write four queries, one for each of these tables. (This replaces the navigation steps: the query itself determines which tables and fields are imported into Power BI.)
let
    tquery = "SELECT * FROM TABLE1", // sample
    Source = Sql.Database(Server, Database, [Query = tquery]),
    Othersteps = xxxxxx
in
    Othersteps
Regards,
Xiaoxin Sheng
@v-shex-msft wrote: Actually, you need to write four queries for these query tables. [...]
Got it, thank you. One query for each set of data I wish to import to Power BI.
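To make that conclusion concrete: each Power BI query gets its own single statement in its own SQL statement box, not several statements separated by semicolons in one box. A minimal sketch, again with hypothetical table names:

```sql
-- Query 1 (its own Power BI query / SQL statement box):
SELECT [order_id], [order_date], [amount]
FROM [dbo].[orders];

-- Query 2 (a separate Power BI query / SQL statement box):
SELECT [customer_id], [customer_name], [region]
FROM [dbo].[customers];

-- ...and likewise, one statement each for the remaining two tables.
```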