FlyKick
Helper II

Best Practice For Displaying Date/Times in Local Time Zone

Hi Brains Trust,

 

I have a data model that has 6 tables each with at minimum 3 date/time columns all from an API endpoint in UTC.

 

Here is an example of one of the tables

[image: FlyKick_0-1632637640397.png]

 

Here is the Power Query for the sample table above.

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bc49DsIwDIbhq1Se28o/cRJ75AQMmYgyUTYEEqL3JyAGiro+sr7XtQKxBI0wAiPThDYxFxYncQ6nX5aC6NI5bjgUJKfk8rk+Pu7Len4Oh/W2XC8DddKcZkNoYwXLKWrYjr5b6pj/R/da+v2MaKfFncxsVoXWXg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [TransID = _t, #"Date Entered (UTC)" = _t, #"Date Picked (UTC)" = _t, #"Date Shipped (UTC)" = _t, Description = _t, #"Order Value" = _t])
in
    Source

 

 

I have done a ton of reading on this issue, and my understanding is that to show these date/times in the Power BI service in each end user's local time zone, I need to perform the following steps.

 

1. Check If Daylight Savings Is Active and Get The Time Zone Offset

James Fancke from Self Service BI has done the heavy lifting already and kindly provided the Power Query code to achieve this.

Converting from UTC to Local Timezone in Power BI and Azure Analysis Services – SelfServedBI.com.

 

I added two custom columns to his query 

  • AEST Offset = DateTimeZone.ZoneHours([Last Refresh Date Time])
  • InDST = if [AEST Offset] = 10 then false else true
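
In M, those two custom columns could be added along the following lines. This is just a sketch: #"Last Refresh" and the Last Refresh Date Time column are assumed to come from the linked article's query, and the AEST-specific rule is my assumption.

```
let
    // Single-row query from the SelfServedBI article (name assumed)
    Source = #"Last Refresh",
    // Extract the zone offset, in hours, from the refresh timestamp
    AddOffset = Table.AddColumn(Source, "AEST Offset",
        each DateTimeZone.ZoneHours([Last Refresh Date Time]), type number),
    // Standard AEST is UTC+10; any other offset means daylight saving (AEDT, UTC+11) is active
    AddInDST = Table.AddColumn(AddOffset, "InDST", each [AEST Offset] <> 10, type logical)
in
    AddInDST
```

Note that `[AEST Offset] <> 10` is equivalent to the if/then/else version above, just shorter.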

Here is the resulting table. 

[image: FlyKick_1-1632638318791.png]

 

Question 1 - I am planning to reference the AEST Offset column to convert the UTC time columns to local time using this code. 

Date Entered (Local) = DateTimeZone.SwitchZone(DateTime.AddZone([#"Date Entered (UTC)"], 0), List.Min(#"Date-AEST"[#"AEST Offset"]))

 

Is this best practice? Should I be using a parameter instead? I have also read conflicting recommendations that this should be done in DAX rather than Power Query, but then I read that doing things in Power Query is more efficient. Am I on the right track here?
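
For comparison, one parameter-style alternative is to read the offset into a single step once and reference it, instead of calling List.Min inside every column expression. This is a sketch; SampleTable and the query names are assumptions, and the DateTime.AddZone call assumes the UTC columns are plain datetime (SwitchZone requires a datetimezone input).

```
let
    // Read the current offset once; referencing this value acts like a parameter
    CurrentOffset = List.Min(#"Date-AEST"[#"AEST Offset"]),
    // Tag the plain-datetime UTC column as +00:00 before switching zones
    ToLocal = Table.AddColumn(SampleTable, "Date Entered (Local)",
        each DateTimeZone.SwitchZone(DateTime.AddZone([#"Date Entered (UTC)"], 0), CurrentOffset),
        type datetimezone)
in
    ToLocal
```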

 

 

2. Create Custom Columns for each UTC Date/Time Column Adding the Offset Hrs

For context I am referring to these columns 

[image: FlyKick_2-1632639031987.png]

 

I am using the following Power Query code to create the custom columns and convert them all to our local time.

  • DateTimeZone.SwitchZone(DateTime.AddZone([#"Date Entered (UTC)"], 0), List.Min(#"Date-AEST"[#"AEST Offset"]))
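
To avoid repeating that expression once per column, the add-column step can be looped with List.Accumulate. A sketch, assuming the UTC columns are typed datetime and the #"Date-AEST" query exists as described:

```
let
    Offset = List.Min(#"Date-AEST"[#"AEST Offset"]),
    UtcPrefixes = {"Date Entered", "Date Picked", "Date Shipped"},
    // Add one "(Local)" column per "(UTC)" column
    AddLocals = List.Accumulate(UtcPrefixes, Source,
        (state, name) => Table.AddColumn(state, name & " (Local)",
            each DateTimeZone.SwitchZone(
                DateTime.AddZone(Record.Field(_, name & " (UTC)"), 0), Offset),
            type datetimezone))
in
    AddLocals
```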

Here is the resulting table. As you can see, it achieves the desired result: for each of the UTC date/time columns I now have the same time in local date/time.

[image: FlyKick_3-1632642282726.png]

 

I have numerous questions here but will limit them to these three crucial ones

 

Question 1 - This just doesn't seem efficient at all. First, we are duplicating every date/time column, and if we wanted this report to support multiple time zones we would have to create a column for each, which boggles my mind. Luckily I only have to deal with a single time zone for now, but I am curious whether this really is best practice? Maybe this is why they recommend doing date/time transformations in DAX, using measures? 🤷‍♂️

 

Question 2 - We need to do quite a bit of relative date filtering: things like this week vs last week, this month vs last month, etc. I understand I am going to need a date dimension table to store information about each date, like whether it is a weekend and what its ISO week, month and quarter numbers are.

 

Again, James Fancke has done most of the heavy lifting for us with a great article on this, which you can read here:

Creating a date table in Power Query – SelfServedBI.com

 

I presume I would split only the local date/time columns, namely:

  • Date Entered (Local)
  • Date Picked (Local)
  • Date Shipped (Local) 

Is there a recommended way to split these columns? Again it seems I am duplicating the columns and changing their type, one to date and one to time. Further, can I join all three columns to the date dimension table? Presumably I will need to, so that I can relative-date filter on each of them, i.e. how many orders did we enter this week vs last? How many did we ship last week vs this week?
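
The split for one column could look like this (a sketch; repeat or loop for the other two, and Source is assumed to already carry the "(Local)" columns typed datetimezone). On the relationship question: a fact table can hold several relationships to one date dimension, but only one can be active; the others are created inactive and switched on per measure with DAX's USERELATIONSHIP.

```
let
    // Drop the zone tag, then split into pure date and time parts
    WithDate = Table.AddColumn(Source, "Date Entered (Date)",
        each DateTime.Date(DateTimeZone.RemoveZone([#"Date Entered (Local)"])), type date),
    WithTime = Table.AddColumn(WithDate, "Date Entered (Time)",
        each DateTime.Time(DateTimeZone.RemoveZone([#"Date Entered (Local)"])), type time)
in
    WithTime
```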

 

Again, it boggles my mind how you would handle this if you had to support multiple time zones 🤷‍♂️🤷‍♂️🤷‍♂️ so I feel I am doing it the wrong way...

 

Question 3 - Can anyone point me to some guides on how to calculate the business hours between two dates? I need to count hours between 9am and 5pm in our local time zone on weekdays (Monday-Friday) that aren't public holidays. I have added an IsWeekend column to my date dimension table, so excluding weekends seems straightforward, but I'm not quite sure how to handle the hours and public holidays.

 

I think the public holidays will be straightforward enough: get a list of them via API from the site below, then add a custom column in the date dimension table to mark matching dates as public holidays.

Australian Public Holidays Dates Machine Readable Dataset - Datasets - data.gov.au
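
Flagging the date dimension from that list could look roughly like this. A sketch only: the URL is a placeholder for the actual CSV resource link on the dataset page, and the Date column name, the yyyyMMdd text format, and the DateDim query name are all assumptions.

```
let
    // Placeholder URL: substitute the real CSV resource link from the dataset page
    RawCsv = Csv.Document(Web.Contents("https://data.gov.au/<public-holidays-resource>.csv")),
    Promoted = Table.PromoteHeaders(RawCsv, [PromoteAllScalars = true]),
    // Assumed: the dataset publishes dates as yyyyMMdd text
    Holidays = Table.TransformColumns(Promoted,
        {{"Date", each Date.FromText(_, [Format = "yyyyMMdd"]), type date}}),
    // Flag matching dates in the date dimension
    WithFlag = Table.AddColumn(DateDim, "IsPublicHoliday",
        each List.Contains(Holidays[Date], [Date]), type logical)
in
    WithFlag
```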

 

For the hours I am lost and would appreciate any input. 
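
One possible shape for the hours is a small M function that walks the calendar days the interval touches, keeps non-holiday weekdays, and clips each day to the 9am-5pm window. A sketch under those assumptions; holidays is a list of dates such as one built from the dataset above.

```
// Business hours between two local datetimes: 9am-5pm, Mon-Fri, skipping holiday dates
let
    BusinessHoursBetween = (start as datetime, finish as datetime, holidays as list) as number =>
        let
            // Every calendar date the interval touches
            days = List.Dates(
                Date.From(start),
                Duration.Days(Date.From(finish) - Date.From(start)) + 1,
                #duration(1, 0, 0, 0)),
            // Keep Mon-Fri and drop public holidays
            workDays = List.Select(days,
                each Date.DayOfWeek(_, Day.Monday) < 5 and not List.Contains(holidays, _)),
            // Overlap of [start, finish] with one day's 9am-5pm window, in hours
            hoursOn = (d as date) =>
                let
                    s = List.Max({start, d & #time(9, 0, 0)}),
                    f = List.Min({finish, d & #time(17, 0, 0)}),
                    h = Duration.TotalHours(f - s)
                in
                    if h > 0 then h else 0
        in
            List.Sum(List.Transform(workDays, hoursOn)) ?? 0
in
    BusinessHoursBetween
```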

 

WOW, that was a lot longer than I thought it was going to be. I should probably break it up into multiple questions, but then context would probably be lost. If you have read this far, thank you! I appreciate any advice, pointers or references to other materials.

 

Thanks in advance, 

Fly Kick

4 REPLIES
GilbertQ
Super User

Hi @FlyKick 


Q1: What I would suggest is to create the new column with the date/time format you want. You can then delete the original column, so you only have your 3 date/time columns. And yes, doing this in PQ is more efficient and easier than in DAX.

 

Q2: You can use the blog post from SQLBI.COM below where you can use Calculation Groups to allow the user to switch between your 3 date/time columns. I have used this in the past and it works really well.

Using calculation groups to switch between dates - SQLBI

 

Q3: I do not have the answer for that one, and I am sure someone will be able to assist you!






Hey Gilbert,

 

Thanks for the detailed response much appreciated. I will review in detail later today.

 

On question 1, though, I woke up this morning and everything sunk in. You are correct that creating the additional columns in PQ will be more efficient. However, this method does require you to create new date/time columns for every time zone your report needs to support.

 

If I were to add more time zones to the daylight savings table in PQ and then use calculated columns in DAX to create the local date/time columns I can use a slicer to filter the daylight savings table and reference the offset value. 

 

Whilst from a performance point of view this will be slower than PQ, it will be more efficient overall, as you are only duplicating each date/time column in your model once. If you were to do it in PQ, you would need a new date/time column for every time zone you want to support, which grows with every column and time zone combination.

 

So in summary, there doesn't seem to be a "best practice" so to speak. Both PQ and DAX have their advantages and disadvantages. It's up to the architect to think through the model and end-user requirements and choose the method that will result in the best performance.

 

I haven't tested my theory above yet, so I could be completely wrong, but it makes sense in my head 😂🤣 I will reply here once I have tested it and proven the theory.

 

Cheers,

Fly Kick

For Question 3, I would strongly recommend that you do not attempt to create that calendar table in Power Query or DAX. Instead, create a Calendar table externally (in SQL Server, a SharePoint file, etc.). This will save you a lot (a lot) of sanity.

 

Once you include business hours, you also need to start thinking about what to do when your users record activities outside of the standard hours. For example, what should happen when someone starts working on Monday at 6am, or finishes on Saturday morning at 10am? What should the granularity be? Is 10:15am different from 10:00am, etc.?

 

If and when you have well defined rules for these cases you can then implement these rules in your DAX measures and create cross join tables with the required granularity.
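
A cross-join table of that kind might be generated like this (a sketch at hourly granularity; the date range is an assumption):

```
let
    // One row per date x hour combination
    Dates = List.Dates(#date(2021, 1, 1), 365, #duration(1, 0, 0, 0)),
    Hours = {0..23},
    CrossJoin = Table.FromRows(
        List.TransformMany(Dates, each Hours, (d, h) => {d, h, d & #time(h, 0, 0)}),
        type table [Date = date, Hour = Int64.Type, DateHour = datetime])
in
    CrossJoin
```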

Thanks for the advice lbendlin, sounds like I may be out of my depth with this one. Might have a poke around and see if I can find a PBI consultant that might be able to do it for us 😉 Cheers
