austinsense
Impactful Individual

Difficult Question: Dynamic Time Zones with Power BI Embedded, Direct Query, and RLS

I've been turning this thing over in my mind for the past week. Here's the challenge, and I'll state it from the perspective of a report user ...

 

"As a user I need to be able to see my data, on a report, in the time zone of my choosing - for example if I set my time zone preference to 'UTC -5' I expect data to be shown as if it were being reported from that time zone."

 

A few things you need to know.

  • We have an IoT product that collects sensor data and surfaces that data in a web application, app.conserv.io. We use Power BI Embedded as our analytics engine in the application.
  • Our application data lives in Postgres on AWS. Because we want our users to experience near real-time data in the application, we need a model that uses Direct Query, and we need a data source that supports Direct Query, so we migrate our data to AWS Redshift every minute using Stitch. We've got a few performance kinks to work out, but this is working great. Note: I'm not interested here in debating the merits of AWS vs. Azure unless it's directly relevant to solving this issue. We chose AWS for lots of reasons and I'm happy to discuss it elsewhere.
  • Our application uses Power BI Embedded with Row Level Security, so when a user views a report in our application they can only see the sensors that they are assigned to see. This part is working great.
  • Our sensor readings are recorded in a 'Readings' table that has a 'DateTime' column recording the time the reading was taken. All readings are recorded at "UTC +00", and each user has a default time zone preference set in the application. The 'Users' table has a 'timezoneoffset' column that's an integer between -12 and 12.
  • Not relevant to this problem, but just to help you understand the data model a bit better: there's a 'Sensor' table with all of our devices and a 'UserSensor' table that has the security relationships between 'User' and 'Sensor'. This is a very simple model, and all tables are set to 'Direct Query' mode (see the sketch after this list).
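
Roughly, in Postgres terms, the model looks something like this. Only 'DateTime' and 'timezoneoffset' are named above; every other table and column name here is just a placeholder for illustration:

-- Hypothetical schema sketch (Postgres-style); names beyond "DateTime"
-- and "timezoneoffset" are assumptions, not the real column names.
CREATE TABLE Users (
    user_id        INT PRIMARY KEY,
    username       VARCHAR(256),        -- identifies the viewer for RLS
    timezoneoffset INT                  -- integer between -12 and 12
);

CREATE TABLE Sensor (
    sensor_id INT PRIMARY KEY,
    name      VARCHAR(256)
);

CREATE TABLE UserSensor (               -- security bridge: which user sees which sensor
    user_id   INT REFERENCES Users (user_id),
    sensor_id INT REFERENCES Sensor (sensor_id)
);

CREATE TABLE Readings (
    sensor_id  INT REFERENCES Sensor (sensor_id),
    "DateTime" TIMESTAMP,               -- always recorded at UTC +00
    value      DOUBLE PRECISION
);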

Things we've tried so far.

  • FIRST. I started as I usually do by separating the Date and Time columns on the 'Readings' table and creating 'Date' and 'Time' tables. The 'Time' table had one row for every minute in the day. I set up a 'TimePeriod' table that does a few things.
    • "Reading Granularity": I want the user to be able to choose whether they want to see readings aggregated at every 15, 30, or 60 minutes. This works great. Happy to walk you through how I did this.
    • "Time Zone": I want the user to be able to choose a time zone and see the right one displayed on the report. I basically set up 24 versions of reality, corresponding to each time zone, so the user can pick which version of reality they want to see. This doesn't work. If I have a reading from 2am and I move that reading back 5 hours, I get a time that shows 9pm. The time shifts correctly, but I haven't accounted for the change in the day.
  • SECOND. I created a calculated column on the readings table using DAX that finds the preference for the user and offsets the datetime accordingly.
    • Not ideal, because I'm planning on splitting this column into date and time and then relating both of these columns to separate date and time tables.
    • But this concern is moot because I can't create a calculated column that references another table when I'm using Direct Query.
  • THIRD. I created a Power Query parameter (-12 to 12) that is passed into the Readings table at query time, and we pass the user's preference into the query when the Power BI Embedded service renders the report.
    • This doesn't work because passing a parameter happens at the dataset level, so two users with different time zone preferences looking at a report at the same time will see their reports switching back and forth between time zones.

Other *specific* thoughts.

  • CRAZY IDEA. What if I create a datetime table that has every datetime, at the minute level, over the course of 25 years, and then duplicate that data for every time zone, thereby creating 24 versions of reality for every minute over a 25-year period?
    • 300M+ rows, extremely high cardinality. I'm not going to do this, but I did at least think about it.
  • SQL RENDER.
    • A date table that is created dynamically when the report renders and links a datetimeid on the Readings table with the proper date and time in their respective tables.
    • Or we create a calculated column in a view of the 'Readings' table that adds a timezone-adjusted datetime before passing the readings data to Power BI (a rough sketch follows this list).
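
Roughly, that second idea would look something like this on the Redshift side, reusing the illustrative names from the sketch above. The offset is hard-coded here as a placeholder, which is exactly the catch: at the dataset level it can't vary per report user.

-- Hypothetical sketch of a timezone-adjusted view over 'Readings'.
-- DATEADD is Redshift syntax; the -5 offset is only a placeholder.
CREATE VIEW readings_adjusted AS
SELECT
    r.sensor_id,
    r.value,
    r."DateTime"                                    AS datetime_utc,
    DATEADD(hour, -5, r."DateTime")                 AS datetime_local,
    CAST(DATEADD(hour, -5, r."DateTime") AS DATE)   AS local_date,   -- key for the 'Date' table
    CAST(DATEADD(hour, -5, r."DateTime") AS TIME)   AS local_time    -- key for the 'Time' table
FROM Readings r;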

Other *random* thoughts.

  • I'm very wedded to the idea of having a date table and time table, but maybe I should let go of that assumption. I'm not saying this solves anything specifically, but the thought keeps occurring to me.
  • If we're developing a graph in React, we just pass it the datetime array and then offset the array right at the moment the report renders. Is there a way to do this in Power BI?

Here seems to be the rub. To have a set of dates and times to show on a report, you must have those dates and times in a table in the model. In other words they can't just be magically rendered on the spot. At the same time what I need is a layer that is rendered at "run time" based on the preference of the report user.

 

Interested to hear the community's thoughts.

 

PS I can't seem to change my default signature which says I'm VP of Operations at PowerPivotPro. That's not up-to-date. I'm now the CEO at Conserv, a startup focused on sensors and software to better preserve art and cultural collections.

Austin is VP Operations at PowerPivotPro and a professional self-service enthusiast 🙂
11 REPLIES
EnglishCJ
Frequent Visitor

 To have a set of dates and times to show on a report, you must have those dates and times in a table in the model. In other words they can't just be magically rendered on the spot. At the same time what I need is a layer that is rendered at "run time" based on the preference of the report user.

Why must you have those [timezone-adjusted] datetimes in a table in the model [in Power Query, yes?]? Why can't a DAX measure and a calculated table be used, such as:

// Make a calculated table for a slicer of timezone offsets
// (GENERATESERIES outputs a single column named [Value]):
'USER_REPORT_SELECTOR_CALCULATED_TABLE' = GENERATESERIES ( -12, 12, 1 )

// Make a measure that combines the data with the user-selected offset, or the default offset
[USER_ADJUSTED_TZ] :=

VAR WHATEVER_IS_YOUR_DATA =
	// A bare column reference isn't a valid expression for CALCULATE, so wrap it in MAX
	CALCULATE (
		MAX ( 'IoT_product'[timepoint] ),
		FILTER ( ALLSELECTED ( 'IoT_product'[timepoint] ), NOT ( ISBLANK ( 'IoT_product'[timepoint] ) ) )
	)

VAR WHATEVER_IS_YOUR_USER_DEFAULT_PREFERENCE =
	SELECTEDVALUE ( 'PQ_USER_RLS'[default_tz] )

VAR WHATEVER_IS_YOUR_USER_SELECTED_PREFERENCE =
	SELECTEDVALUE ( 'USER_REPORT_SELECTOR_CALCULATED_TABLE'[Value], WHATEVER_IS_YOUR_USER_DEFAULT_PREFERENCE )

RETURN
	// Adding 1 to a datetime adds a whole day, so divide the hour offset by 24
	WHATEVER_IS_YOUR_DATA + WHATEVER_IS_YOUR_USER_SELECTED_PREFERENCE / 24

 

The above DAX should really have:

RETURN
	IF ( ISBLANK ( WHATEVER_IS_YOUR_DATA ), "N/A", WHATEVER_IS_YOUR_DATA + WHATEVER_IS_YOUR_USER_SELECTED_PREFERENCE / 24 )

But the question is basically: why not have a DAX measure of 'your data'[time] + [offset]?

dince24
New Member

Hi, curious to hear how you solved this in the end?

 

JoaquinRuisenor
Regular Visitor

Hey Avi, your answer is the closest solution I have found for cross-time-zone reporting. Your pbix file can be downloaded from the link you provided; however, the pbix takes data from an Excel file which I wasn't able to download. Is it possible to get this data from your drive? It would help me a lot.

 

Alternatively, you could explain or give a sample of the data that Excel file contained, so we can make sense of the M language within the pbix file.

 

Thank you.

Austin,

 

Here is what I got: Download Files

(Huh... I was hoping the forum would have an option to upload a file, but I didn't find it. Above is a link to my OneDrive.)

 

Below is a quick video with a bit more info about the solution.

Anonymous
Not applicable

Avi,

 

Is the sample file from your video still available for this issue? I have the exact same requirement for a solution we are building, where we need to present Contact Center data across time zones.

 

I posted a similar question to the community, but just came across your post today.

 

Thanks,

 

Jeremy

Here’s my idea. 
 
Create a dim date table at the hour level of granularity (24 rows per day, with the key on date/hour UTC). Store the minutes (and seconds) in the fact table only. Then create a second table that mirrors the date table key column and contains one row for each time zone needed per UTC hour. Something like:
 
Table: DateAdjust

DateHourUTC | TimeZoneID | AdjustedDateTime
E.g. 2 Feb 2019 0100 | -2 | 1 Feb 2019 2300
 
So if you have 7 time zones to cater for, this DateAdjust table will be 7x larger than the date table.
 
Turn on bidirectional filtering to the date table. 
 
Join the TimeZoneID column to a user security fact table with the USERNAME and time zone.

The user security fact table filters the DateAdjust table so each user can only see their own time zone rows. That table filters the date table in UTC, which in turn filters your fact table.
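
A rough Postgres-style sketch of how such a DateAdjust table could be generated (the date range, names, and syntax are placeholders; in Redshift you would swap generate_series for a numbers table and use DATEADD):

-- Hypothetical sketch: one row per UTC hour per supported timezone offset.
CREATE TABLE DateAdjust AS
SELECT
    h.DateHourUTC,
    tz.TimeZoneID,
    h.DateHourUTC + tz.TimeZoneID * INTERVAL '1 hour' AS AdjustedDateTime
FROM generate_series(
         TIMESTAMP '2019-01-01 00:00',
         TIMESTAMP '2029-12-31 23:00',
         INTERVAL '1 hour'
     ) AS h(DateHourUTC)
CROSS JOIN (SELECT DISTINCT timezoneoffset AS TimeZoneID FROM Users) AS tz;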


* Matt is an 8-time Microsoft MVP (Power BI) and author of the Power BI book Supercharge Power BI.
aabbey
Frequent Visitor

I see two major pain points:

 

1 - You need to do the time zone correction before you split the date and time up.

The TZ correction needs to be done in a calculated column (that is dynamic to the user); then that column can be split into date and time portions just by repeating the calculation and converting to date or time. Those columns can then be joined to the separate date and time tables.

 

2 - Direct query is restrictive

You cannot use the offset related to the user in the Users table, and you can't use an independent offset chosen via a "What If" parameter setup.

 

So that means getting the correct time zones into the reading table, as a column, in a way that allows two users in different time zones to view the same sensor id (rls filter) at the same time, and get a personalized view.

 

One way to do that is to duplicate the readings record set for each user preference time zone.

This will explode your data, of course, but with the restrictions on direct query, I'm not sure a compact solution is available. Hopefully I am wrong!

 

Example:

If you have sensor A with 4 users with unique time zone preferences, you would need four sets of readings unioned together, with one of the potential time zone offsets added as a "TZOffset" column for each set.

 

The RLS can now handle the sensor ID filter and the time zone filter. This allows two users of the same sensor with different time zone needs to view the same data with a personalized view.

And, with the offset number in the Readings table, you can do the calculated column to get a corrected datetime fairly easily. From there, you can go back to #1 and also have your separate date and time tables.
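
A minimal sketch of that expansion on the source side, assuming the tables described in the original post (names are illustrative). A CROSS JOIN against the distinct user offsets is equivalent to unioning one copy of the readings per time zone:

-- Hypothetical sketch: duplicate the readings once per distinct user
-- timezone preference, carrying the offset as a TZOffset column.
-- DATEADD is Redshift syntax.
CREATE VIEW readings_expanded AS
SELECT
    r.sensor_id,
    r.value,
    r."DateTime"                                    AS datetime_utc,
    tz.timezoneoffset                               AS TZOffset,
    DATEADD(hour, tz.timezoneoffset, r."DateTime")  AS datetime_local
FROM Readings r
CROSS JOIN (SELECT DISTINCT timezoneoffset FROM Users) AS tz;
-- RLS then filters on both sensor_id (via 'UserSensor') and TZOffset
-- (via the user's timezoneoffset), so each user sees exactly one copy.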

 

If I come up with anything else I will let you know.

Juicy problem though, thanks for bringing it to the community!

 

 

 

 

 

 

Anonymous
Not applicable

Slightly less crazy idea

Create a minute-only table with 24 columns, one per TZ offset, and a column indicating which one is the first on or after midnight.

You should be able to reverse engineer from a user TZ to the fact table offset and also generate the local TZ display value from that?
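
One possible reading of that shape, heavily adapted: only two of the 24 local-time columns are shown, and per-offset day-rollover flags stand in for the single "first on or after midnight" column. All names and the Postgres-style syntax are illustrative only:

-- Hypothetical sketch: one row per minute of the UTC day, one local clock-time
-- column per offset, plus flags for whether that offset rolls the local date
-- forward or back relative to the UTC date.
CREATE TABLE minute_offsets AS
SELECT
    m.minute_utc,                                      -- TIME of day, 00:00 .. 23:59
    m.minute_utc + INTERVAL '1 hour'   AS local_plus_1,
    m.minute_utc - INTERVAL '5 hours'  AS local_minus_5,
    -- ... one local column per offset, -12 through +12 ...
    (m.minute_utc >= TIME '23:00')     AS plus_1_rolls_to_next_day,
    (m.minute_utc <  TIME '05:00')     AS minus_5_rolls_to_prev_day
FROM (
    SELECT (TIME '00:00' + n * INTERVAL '1 minute') AS minute_utc
    FROM generate_series(0, 1439) AS n
) AS m;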

How does this fix the date issue?

Austin is VP Operations at PowerPivotPro and a professional self-service enthusiast 🙂
Anonymous
Not applicable

I'll try a demo tomorrow.

Also, to your React question: Power BI hands your custom visual a dataset, which you can then manipulate as you like.

So... essentially, if you can generate the table to correctly "model" the offset calculation, you'd already have the problem solved.
