gatapia
Frequent Visitor

Large dataset options for reporting server

I'm looking at building a large report (>100M rows) and I can't seem to find a good working strategy on Report Server. On Premium this is easy using deployment pipelines, incremental refresh, etc. But what options are there for Report Server? I.e., how do I keep a small working dataset locally (Desktop) and the full dataset on the report server?

4 REPLIES
gatapia
Frequent Visitor

I managed to find a slightly better workaround (still terrible though) which worked in my scenario. I am using explicit SQL queries to get my data, i.e. not just pointing to a table, but an actual select ... from ... I do the subset for development using a where clause.

What I needed in PROD was to ignore the where clause that filters the data. So what I did was create a service account to refresh the data on Report Server, say the account was domain\svc_data_refresh. Then I just changed my query to:

select ... from table
where ...
  -- the first condition is true when the refresh runs as the service account,
  -- so the date filter is effectively bypassed and the full dataset loads;
  -- any other account only gets the filtered subset
  and ((select CURRENT_USER) = 'domain\svc_data_refresh'
    OR /* filter clause */ date_created > '2021-01-01')


This ignores the filter clause when the user is the service account. That way, when I refresh data in Power BI Desktop (using my account) the data is filtered, while the server, refreshing under the service account, loads the full dataset.
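If you try this, it is worth checking which identity the query actually sees, since CURRENT_USER returns the database user name while SUSER_SNAME() returns the Windows login. A quick sanity check (run it both from Desktop and from a Report Server scheduled refresh; the column aliases are just placeholders):

-- shows the database user and the Windows login the connection is running under
select CURRENT_USER  as database_user,
       SUSER_SNAME() as windows_login;

Whichever of the two comes back as 'domain\svc_data_refresh' is the one to compare against in the where clause.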

gatapia
Frequent Visitor

Wow, what a terrible workaround.   But thanks for the confirmation.


@gatapia wrote:

Wow, what a terrible workaround.   But thanks for the confirmation.


It depends on what your use case is. In our scenario some of our models have 20+ reports linked to them using a live connection, so we can update a measure in one place and any fixes/improvements flow through to all those reports. And adding a page or changing a property on a visual in a report does not require re-processing the entire dataset - it can be deployed instantly.

d_gosbell
Super User

There is not really a good solution for this on Report Server alone. What we do with models of this size is build them as SSAS Tabular models. Then we can use deployment tools and partitions for incremental processing, and we just create pbix reports in live connect mode to SSAS. If you have SSAS on a separate server (which is a good idea to spread the load), you just need to ensure you have Kerberos set up.
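For anyone unfamiliar with the partition approach: each table in an SSAS Tabular model can be split into several partitions, each with its own source query, and only the partitions whose data has changed get reprocessed. A minimal sketch of what the source queries might look like, assuming a hypothetical dbo.FactSales table partitioned by year (table and column names are placeholders, not from this thread):

-- Partition "FactSales 2020" (processed once, then left alone)
select * from dbo.FactSales
where date_created >= '2020-01-01' and date_created < '2021-01-01';

-- Partition "FactSales 2021" (the only partition reprocessed by the nightly refresh)
select * from dbo.FactSales
where date_created >= '2021-01-01' and date_created < '2022-01-01';

Processing an individual partition can then be scripted from SSMS or scheduled (e.g. via a SQL Server Agent job running an Analysis Services command), and the pbix files never need to be redeployed because they only hold the live connection.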
