Anonymous
Not applicable

Peak Renders analytics

Hi everyone,

 

We have just (finally) moved from the old Workspace Collections to the new Embedded service and onto the A1 SKU. Our dashboard connects to an Azure SQL Database. So far so good, although we've hit the 3GB RAM limit on the refresh. That's another story!

 

What I'd like to know is how I can monitor 'peak renders per hour' so I can tell how close we are getting to the 300-renders-per-hour limit.

 

The out-of-the-box 'Usage Metrics' in Power BI are too basic - they just report views per day.

 

The Azure metrics don't seem to include this, although I don't yet understand whether QPU High Utilization, Query Duration, etc. will help.
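In case it helps anyone else poking around, this is roughly how I've been checking what Azure Monitor actually exposes for the capacity resource - a Node/TypeScript sketch using @azure/identity and the Azure Monitor REST API; the subscription, resource group and capacity names are placeholders:

```typescript
// Rough sketch: list the metrics Azure Monitor exposes for a Power BI Embedded
// capacity, to see whether anything render-related is available at all.
// Subscription/resource-group/capacity names below are placeholders.
import { DefaultAzureCredential } from "@azure/identity";

const RESOURCE_ID =
  "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
  "/providers/Microsoft.PowerBIDedicated/capacities/<capacity-name>";

async function listMetricDefinitions(): Promise<void> {
  const credential = new DefaultAzureCredential();
  const { token } = await credential.getToken("https://management.azure.com/.default");

  const url =
    `https://management.azure.com${RESOURCE_ID}` +
    `/providers/Microsoft.Insights/metricDefinitions?api-version=2018-01-01`;

  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` },
  });
  const body = await response.json();

  // Print the metric name, unit and default aggregation for each definition.
  for (const definition of body.value ?? []) {
    console.log(
      `${definition.name?.value} (${definition.unit}, ${definition.primaryAggregationType})`
    );
  }
}

listMetricDefinitions().catch(console.error);
```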

 

I'm really hoping I've missed something obvious! Thanks in advance!

9 REPLIES
Anonymous
Not applicable

I would like to know: how do you charge your clients for the cloud computing, or the embedding part of it?

I have found it hard, and it's too expensive.

Any advice would be appreciated.

Anonymous
Not applicable

Hi @Anonymous 

 

Charge a subscription for the service -- then provide lots of updates during the year so they feel the value! 🙂

 

We use an Azure SQL database at about £120/month plus the Power BI Premium A1 SKU at £550/month plus a little cost for web hosting. These costs amount to about 25-30% of the income; labour (admin, development) is the biggest part, and then something for profit. You need enough clients to get going - that's the hardest part.

 

Alternatively, your clients could have Power BI Pro and you could take an account in their tenant and populate it from there. That's messier and doesn't seem as professional to me, but it's a good way to start and avoids the hefty fixed cost of the A1 SKU.

 

With the old Power BI Embedded it was a lot cheaper and you paid for usage. Premium is a much better service but still aimed towards large corporates rather than small ISVs.

 

You can run quite a lot of analytics on A1, especially if you optimise as much as possible (a good trick is to pause/unpause the capacity each morning, which clears the cache).
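If anyone wants to automate that morning pause/unpause, something along these lines works against the ARM REST API - just a sketch: the resource names are placeholders and the api-version may need adjusting to whatever is current:

```typescript
// Rough sketch: suspend/resume (pause/unpause) an A-SKU capacity via the ARM
// REST API, e.g. from a scheduled job each morning. Names are placeholders and
// the api-version may need adjusting.
import { DefaultAzureCredential } from "@azure/identity";

const CAPACITY_ID =
  "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
  "/providers/Microsoft.PowerBIDedicated/capacities/<capacity-name>";
const API_VERSION = "2021-01-01";

async function setCapacityState(action: "suspend" | "resume"): Promise<void> {
  const credential = new DefaultAzureCredential();
  const { token } = await credential.getToken("https://management.azure.com/.default");

  const response = await fetch(
    `https://management.azure.com${CAPACITY_ID}/${action}?api-version=${API_VERSION}`,
    { method: "POST", headers: { Authorization: `Bearer ${token}` } }
  );

  // ARM accepts the request and completes the state change asynchronously.
  console.log(`${action}: HTTP ${response.status}`);
}

// Example: resume in the morning; run with "suspend" overnight.
setCapacityState("resume").catch(console.error);
```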

 

It does work very well though - no downtime since it began and pretty quick query responses. 

Anonymous
Not applicable

Hello,

 

Are there any updates? Is it now possible to monitor peak renders per hour in PBI Embedded (Azure)? Thank you in advance.

v-micsh-msft
Employee

Hi,

 

"

How can I monitor capacity consumption?

Monitoring through Azure is on the near-term roadmap. The Azure resource, Power BI Embedded, will include monitoring KPIs that will show health, and usage.

"

So the monitor part should be not available at the current time.

 

For more reference, see:

Power BI Embedded FAQ

 

There is also another thread discussing Peak Hours, for your reference.

 

Regards,

Michael

Anonymous
Not applicable

Hi @v-micsh-msft

 

Thanks for getting back to me. It's good to know some Azure analytics for Embedded is on the near-term roadmap. Could the following be passed to the team for consideration:

 

1. How do we manage the 3GB RAM usage when refreshing an import PBIX on the A1 tier?

2. How can we monitor the 300 peak renders per hour?

3. When using the Master account option, how can we identify which users are using the service the most (in terms of renders)? That would show us which customers use it most heavily and help us price our service more precisely. (A rough client-side workaround we are considering is sketched below.)
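In the meantime, for point 3, this is the rough client-side workaround we are considering: counting the 'rendered' events that the powerbi-client embed API raises and posting them to our own telemetry endpoint. The /api/telemetry/render endpoint and the customerId lookup are placeholders for our own app (not a Power BI API), and the 'rendered' event is only an approximation of how the capacity actually counts renders:

```typescript
// Rough client-side sketch: count 'rendered' events from the powerbi-client
// embed API and post them to our own telemetry store so usage can be
// attributed per customer. Assumes the report is already embedded into
// #reportContainer; the telemetry endpoint and customerId are placeholders.
import * as pbi from "powerbi-client";

const powerbi = new pbi.service.Service(
  pbi.factories.hpmFactory,
  pbi.factories.wpmpFactory,
  pbi.factories.routerFactory
);

const container = document.getElementById("reportContainer") as HTMLElement;
const report = powerbi.get(container) as pbi.Report;

report.on("rendered", () => {
  // One event per completed render (initial load, slicer changes, page
  // switches, etc.) - only a rough proxy for the capacity's render count.
  void fetch("/api/telemetry/render", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      customerId: sessionStorage.getItem("customerId"), // placeholder: our own app's user id
      reportId: report.getId(),
      occurredAt: new Date().toISOString(),
    }),
  });
});
```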

 

We want to get value out of the tiers and not buy unnecessary capacity, and the processes above are key to efficient use of resources. Without them, we would end up buying more capacity, passing that cost onto our customers, making our service less attractive and thus ending up with fewer customers (and, in the long term, less capacity!).

 

Thanks for the other thread on Peak Hours - I had seen it, and it goes to show that there is little information on this yet. The most consistent information I have seen so far was the white paper published last year, which gave some indication of how peak renders work at the different tiers.

 

It would be fabulous if you could provide the real-time data for us; we could then use Power BI to analyse it ourselves.

 

I shall keep watching the Power BI Blog for further updates. In the meantime, we will have to find a way to regularly assess whether our capacity is being throttled back.

 

Thanks.

ThomasDay
Impactful Individual

We've been battling the Embedded pricing tiers and find the QPU limits hugely restrictive, and that's after going to the A2 and A3 price tiers (18K and 32K per year) for a site that maxes out at 2 and occasionally 3 concurrent users. We find A2, A3, and even A4 (72K) fail under modest load. This happens even if the capacity has been idle and only a handful of users access the site per day. Anyone else having this issue?

 

In addition, the pricing tiers are NOT scalable in any workable way: if you are approaching the QPU capacity, for example, there is no way to bump up the pricing tier to service the load until things settle down. Hence a short spike at any time can cause our site to fail. As it is currently configured, one manually sets the pricing tier, and switching from A3 to A4 can take 3-5 minutes, during which time the site is not serving up anything!
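For what it's worth, the tier change can at least be scripted with an ARM PATCH on the capacity resource rather than clicking through the portal - though you still eat the few minutes while the switch applies. A rough sketch only; the resource names are placeholders and the sku/tier values should be checked against the Microsoft.PowerBIDedicated reference:

```typescript
// Rough sketch: change the capacity SKU (e.g. A3 -> A4) via an ARM PATCH.
// Resource names are placeholders; the sku tier value "PBIE_Azure" and the
// api-version should be verified against the current ARM reference.
import { DefaultAzureCredential } from "@azure/identity";

const CAPACITY_ID =
  "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
  "/providers/Microsoft.PowerBIDedicated/capacities/<capacity-name>";

async function scaleCapacity(skuName: "A1" | "A2" | "A3" | "A4" | "A5" | "A6"): Promise<void> {
  const credential = new DefaultAzureCredential();
  const { token } = await credential.getToken("https://management.azure.com/.default");

  const response = await fetch(
    `https://management.azure.com${CAPACITY_ID}?api-version=2021-01-01`,
    {
      method: "PATCH",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ sku: { name: skuName, tier: "PBIE_Azure" } }),
    }
  );
  console.log(`Scale to ${skuName}: HTTP ${response.status}`);
}

scaleCapacity("A4").catch(console.error);
```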

 

As an aside, we are a BI tool that creates context/comparability across multiple data sets--"find all hospitals that have a positive EBITDA, 30% Medicare patient mix, 60%+ occupancy, a 1.25 CMI, and a low readmit rate... and let's see that group's profitability, safety performance, market share trends, etc." So we are by design a query-heavy application, but this seems like mainline Power BI stuff, and the pricing seems to strip the power out of Power BI in this case after 2+ years of sailing along great.

 

Anyone else having these issues?

Thanks--Tom

Anonymous
Not applicable

Hi @ThomasDay

 

I feel your pain - especially when the spikes cause it to fail. For a couple of concurrent users, it seems odd that you need that much capacity. I run a 60MB dashboard for around 300 users across 120 organisations. Not sure about concurrency, but probably 5-10 during peak periods. We peak at around 2.3GB and the base memory load is about 1.9GB. Of course, every use case is different, so it'd be interesting if you could share some more info.

 

How tall are your memory spikes? If they are very tall, that probably means a lot of Power Query work is going on. 

 

Some initial suggestions:

 

1. Maximise query folding.

2. Reduce the 'peaks' on refreshes by getting rid of Power Query code. Do as much in DAX or SQL as you can.

3. Use SQL views.

4. Don't bring in rows, columns and/or tables that aren't used.

5. Use the Azure metrics to monitor Max Memory Usage (see the sketch after this list).

6. Split the PBIX files up - advice that a MS employee gave me. Haven't taken it up yet, as it seems a bit drastic.
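For point 5, this is roughly how we pull the hourly maximums out of Azure Monitor - a sketch only: the resource names are placeholders and the metric name ("memory_metric") is an assumption, so confirm it against the metric definitions the capacity actually exposes:

```typescript
// Sketch: pull hourly maximums for the capacity's memory metric over the last
// 24 hours from Azure Monitor. The metric name "memory_metric" is assumed -
// confirm it via the metricDefinitions endpoint first.
import { DefaultAzureCredential } from "@azure/identity";

const RESOURCE_ID =
  "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
  "/providers/Microsoft.PowerBIDedicated/capacities/<capacity-name>";

async function maxMemoryLast24h(): Promise<void> {
  const credential = new DefaultAzureCredential();
  const { token } = await credential.getToken("https://management.azure.com/.default");

  const end = new Date();
  const start = new Date(end.getTime() - 24 * 60 * 60 * 1000);
  const query = new URLSearchParams({
    "api-version": "2018-01-01",
    metricnames: "memory_metric", // assumed name for the Memory metric
    timespan: `${start.toISOString()}/${end.toISOString()}`,
    interval: "PT1H",
    aggregation: "Maximum",
  });

  const response = await fetch(
    `https://management.azure.com${RESOURCE_ID}/providers/Microsoft.Insights/metrics?${query}`,
    { headers: { Authorization: `Bearer ${token}` } }
  );
  const body = await response.json();

  // One data point per hour; memory values are assumed to be in bytes.
  for (const point of body.value?.[0]?.timeseries?.[0]?.data ?? []) {
    console.log(`${point.timeStamp}  max ≈ ${(point.maximum / 1024 ** 3).toFixed(2)} GB`);
  }
}

maxMemoryLast24h().catch(console.error);
```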

 

If you want to post some more details, maybe we can explore solutions?

 

Kind regards

 

Angus

ThomasDay
Impactful Individual

@Anonymous Hello Angus,

 

To be candid, the issue is not that my models are inefficient. The issue is that Premium capacity is RESERVED capacity whether we use it or not. As a startup with a product that gets heavy use episodically but unpredictably, we leave a HUGE portion of this reserved capacity unused. Cloud-based solutions are appealing to startups because they typically (and previously with Power BI) charge for CONSUMPTION. So it's pay as you grow.

 

I could make my models more efficient and still be paying an outrageous amount for capacity that sits 90% idle... perhaps I could move down a pricing tier... but it's the pricing model that is inefficient.

 

If you're a big, worldwide company--lots of users, 24-hour usage, dashboard-style models, many people given access--fine. But this pricing model is blind to the needs of startups with different model types and different usage profiles.

 

Tom

 

 

Anonymous
Not applicable

Hi @ThomasDay

 

Agreed. When they first released the new version of Power BI Embedded, they didn't have the A* tiers, just P1 as a starter. We fought hard for them to put in lower-level tiers for start-ups. We had a nervous six months waiting for a response from them whilst the old service was in the process of being deprecated.

 

It is in no one's interest other than Microsoft's engineers and marketers to have a tiered capacity system. As you say, customers like to pay for what they use. It must be very annoying to have a model whose peaks are high but only occasional. That feels like a huge waste of money.

 

We are constantly checking whether the Maximum Memory Usage is getting near the ceiling - otherwise it just falls over and doesn't process - and that has a big reputational impact for us.

 

We are also having to work on various ways to expand our offering to customers without increasing the tier, which is such a waste of developer time. We want a fair, consistent price that matches our customers' usage (and their perceived value). This pricing model doesn't provide that.

 

The irony for us is that we use an Azure SQL database for our backend data storage, and that does charge for usage plus a small fixed cost. So it's not as if they can't do this.

 

I hope that someone from Microsoft is listening to the start-ups.
