zudar
Post Patron

Question about scheduled refresh

Hi all,

 

I have a quick question about the scheduled refresh of reports in 'import mode' deployed to a Report Server. FYI, we have obtained Power BI Report Server as part of a purchase of SQL Server Enterprise with active Software Assurance (SA). We retrieve our data from a single SQL Server database.

 

Now, I'm questioning my decision to create reports in 'import mode' because of limitations in the scheduled data refresh. I'm confused by the following:

  • Power BI Premium has a scheduled refresh limit of 48 times a day
  • I've seen examples where people are able to set the scheduled refresh at a minimum of 1 minute (15 minutes in the screenshot below):

[Screenshot: Capture13.PNG, showing a scheduled refresh interval set to 15 minutes]

 

Which of these situations am I in? I would really like to be in the second one. 🙂

 

In any case, 48 times a day would not be enough, since my end-users want a report that's updated every minute or every few minutes. I assume that would require me to build the reports in DirectQuery mode (or with a Live connection?).

Advice is very welcome. 🙂

1 ACCEPTED SOLUTION
d_gosbell
Super User

If your users absolutely cannot have the data be older than 1 minute then you should build your report using Direct Query against a database. Note that you will need to make sure that your database has appropriate resources and is appropriately tuned to handle the additional query load, but this sort of low latency requirement is one of the use cases that Direct Query was designed to address.

 

A live connection probably would not help on its own, as you would still need to load the data into an SSAS Tabular model. It might be an option, though, if you could use an incremental processing strategy to reduce your processing time (PBIRS itself currently does not support any form of incremental processing).
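As a rough sketch of what an incremental processing strategy looks like on an SSAS Tabular model (the database, table, and partition names below are made up for illustration), you would refresh only the partition that receives new data rather than the whole model, using a TMSL `refresh` command:

```json
{
  "refresh": {
    "type": "dataOnly",
    "objects": [
      {
        "database": "SalesModel",
        "table": "FactSales",
        "partition": "FactSales - Current Month"
      }
    ]
  }
}
```

You can run a command like this from SSMS or on a schedule via SQL Server Agent; because only the named partition is reprocessed, the refresh window stays much shorter than a full model refresh.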

 

As a rough rule of thumb, once you get under a 10-15 minute refresh interval you are getting into the range where the cost of constantly loading and compressing the data can outweigh the benefits. If you have large data volumes or a high number of user queries you may need to push these times out; if you have small datasets you might be able to get away with a smaller window.

 

You may even want to consider two models: one that is larger and more complex, holds more historical data, and is updated less frequently; and a second one in Direct Query mode that is simpler and only has access to recent data (to keep the queries small and fast). It depends on your requirements; I just wanted to point out that you may not have to solve all your reporting needs in a single data model.
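For the second, Direct Query-facing model, one common way to limit it to recent data is to point it at a view that only exposes recent rows. A minimal sketch, with invented table and column names:

```sql
-- Hypothetical "recent data only" view for the Direct Query model.
-- Restricting to the last 24 hours keeps each query small and fast.
CREATE VIEW dbo.vw_RecentSales
AS
SELECT SaleId,
       SaleDateTime,
       Amount
FROM dbo.Sales
WHERE SaleDateTime >= DATEADD(hour, -24, SYSUTCDATETIME());
```

The Direct Query model then queries this view at report time, so every visual reflects the source data as of the moment it renders, with no scheduled refresh involved at all.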


4 REPLIES
d_gosbell
Super User

@SYM1986 I don't believe Power Automate can trigger actions on an on-premises Report Server; as far as I know, that only works with the Power BI cloud service.

SYM1986
Helper I

Hi @zudar, I think you can use Power Automate to do this.

[Screenshots: 1.JPG, 2.JPG]


@d_gosbell Thanks for your detailed answer! I will take it all into consideration.
