Magistralis
Frequent Visitor

Rework on current PBI data setup - tips and tricks needed

Good day beautiful people,

 

I need your help and some guidance to rework how we are currently working with PBI.

 

To give you some background, this is how we currently do stuff.

 

Data flow:

  1. Download data from the ATS or a related system
  2. Upload it to SharePoint
  3. A database is connected to the folder and uploads all of the extracts to the DB (1 extract = 1 table)
  4. PBI is connected to the DB and we import all of these tables (Import mode).

We cannot change how we get data from the ATS to the DB (via APIs, etc.).

Also, there is NO modification (merge, delete, etc.) done at the DB level; we download the data exactly as it was uploaded.

 

The tables are not related to each other in any way. They share some columns that can be found in every extract (e.g., User ID), but some extracts have more columns and some fewer.

All the operations (table / column merges, calculations) are made in PQ and / or in DAX. Each table contains about 500-600k rows (the biggest 800k, the smallest 300k).


You can already imagine that this is a SLOW process. Changing anything in PQ takes about 45 minutes of my life.

 

Dashboards

Currently we have about 12 dashboards, and everything is connected via a live connection to our main report, so if we want to change anything, we must change Main_Dashboard to see the changes in the other visualizations.

 

I'm already on the edge, so I decided to do something about it. Right now I'm trying to convince management of this plan:

  1. Move the merges and calculations from PQ to the DB (90% of the merges are just a simple merge to get one column from another table)
  2. Stop using single dataset for multiple dashboards
  3. Add indexes to our DB tables.
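Points 1 and 3 above can be sketched together. Here is a minimal, hypothetical example using SQLite (the actual DB engine isn't stated, and the table and column names are made up for illustration): the "simple merge to get one column from another table" becomes a view in the database, and the join key gets an index, so Power BI can import the already-merged result instead of merging 500k+ rows in Power Query on every refresh.

```python
# Sketch: move a simple PQ merge into the DB as a view, plus an index
# on the join key. Table/column names are illustrative assumptions.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Two extracts loaded as-is, sharing a User ID column.
cur.execute("CREATE TABLE candidates (user_id INTEGER, name TEXT)")
cur.execute("CREATE TABLE applications (user_id INTEGER, job_title TEXT)")
cur.executemany("INSERT INTO candidates VALUES (?, ?)",
                [(1, "Ada"), (2, "Grace")])
cur.executemany("INSERT INTO applications VALUES (?, ?)",
                [(1, "Analyst"), (2, "Engineer")])

# The merge done once in the DB instead of on every PQ refresh:
# pull one column (job_title) from the other table.
cur.execute("""
    CREATE VIEW candidates_enriched AS
    SELECT c.user_id, c.name, a.job_title
    FROM candidates AS c
    LEFT JOIN applications AS a ON a.user_id = c.user_id
""")

# Index the join key so the lookup stays fast as extracts grow.
cur.execute("CREATE INDEX ix_applications_user_id ON applications(user_id)")

rows = cur.execute(
    "SELECT * FROM candidates_enriched ORDER BY user_id").fetchall()
print(rows)  # [(1, 'Ada', 'Analyst'), (2, 'Grace', 'Engineer')]
```

Power BI would then import `candidates_enriched` directly. Since nothing is modified at the DB level today, a view keeps that property: the raw extract tables stay untouched, and the merge logic lives in one queryable place.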

But I feel like I'm missing something, and here is where I need your help. What else should I suggest to improve the overall experience with PBI? I'm not an advanced user, so I'm probably missing some basic steps / solutions.

If not a single dataset, then what? And so on.

 

I would love to learn more about this, but so far I have failed to find anything useful on the web.

 

Thank you in advance!

1 REPLY
otravers
Super User

Your plan is the way to go as transformations in the database will perform better than in PQ. To make the remaining PQ development bearable, you could move any ETL not already done in the database out of the dataset and into Power BI dataflows.

------------------------------------------------
1. How to get your question answered quickly - good questions get good answers!
2. Learning how to fish > being spoon-fed without active thinking.
3. Please accept as a solution posts that resolve your questions.
------------------------------------------------
More tips and guidance in my Power BI architecture and development blog
