## CALCULATE & CALCULATETABLE - What's The Real Difference?

I want to dive into two essential DAX functions that you absolutely need to understand well when using Power BI: CALCULATE and CALCULATETABLE. They are very similar in some aspects but very different in others. I'll walk through both, with the accompanying tutorials, in this post.


## Be Our Friend: 2019 Season 2 Super Users!

Did you know that August 4th is International Friendship Day?  But we say, why just one day?  So here at the Power BI Community, we are declaring August "Power Platform Friendship Month!" And to celebrate, on August 14th, we will be recognizing the "2019 Season 2 Super Users!"


## How To Use Supporting Tables Within Your Data Model For Maximum Effect In Power BI

I wanted to dive into the topic of supporting tables inside of Power BI in this particular blog. Supporting tables are a unique development technique, used for supporting logic that you might run through your core data model. Think of your core data model as the lookup and fact tables that come from your raw data.


## Display Folder Property in Power BI Desktop

Are you feeling lost among a lot of columns in Power BI Desktop? Do you prefer order to disorder? Then you are at the right place.


## Solving Real Life Problems with Recursive Functions in PowerQuery

Every respected computer scientist has heard about recursive functions (more details on Wiki: https://en.wikipedia.org/wiki/Recursion). Many algorithms are cited as examples of such functions, e.g. the Fibonacci sequence as the simplest one:

Fib(0) = 0
Fib(1) = 1
For all integers n > 1, Fib(n) = Fib(n-1) + Fib(n-2)

How do we rewrite this pseudo code into a function in PowerQuery?

```let
    fnFibonacci = (value as number) as number =>
        if value = 0 then
            0
        else if value = 1 then
            1
        else
            @fnFibonacci(value - 1) + @fnFibonacci(value - 2),
    Source = List.Transform({0..10}, fnFibonacci)
in
    Source```

The most important part is the use of @ before we call the recursive function. It tells PowerQuery to reference the function's own name from inside itself.

Let’s test our function:

Well, that is great but not very useful in real life. I had been searching for a long time for a scenario that uses recursive functions in a meaningful way. And I have found one!

In PowerQuery, you can select or rename many columns in one step. Moreover, you can change their data types at once. But repetitive modifications of a table are not so easy anymore.

In my ETL process written in PowerQuery, I wanted to split every datetime column in a fact table into 2 separate columns: date and time. This has at least 2 big advantages. First of all, the date and time values become foreign keys for my date and time dimensions; second, the size of my dataset decreases dramatically. Instead of many unique values, I get only 365 unique values a year for the date dimension and 86,400 unique values for the time dimension at a precision of one second.
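To see why the cardinality drops, here is a quick back-of-the-envelope sketch (Python, purely illustrative, with made-up timestamps): a year of minute-level timestamps collapses into at most 365 distinct dates and 1,440 distinct times (86,400 at one-second precision).

```python
from datetime import datetime, timedelta

# a year of timestamps, one every 7 minutes (illustrative data)
start = datetime(2019, 1, 1)
timestamps = [start + timedelta(minutes=m) for m in range(0, 525600, 7)]

unique_datetimes = len(set(timestamps))              # almost every value is distinct
unique_dates = len({t.date() for t in timestamps})   # at most 365 per year
unique_times = len({t.time() for t in timestamps})   # at most 1,440 at minute precision

print(unique_datetimes, unique_dates, unique_times)
```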

What are my options? I could create a new custom column of date datatype for every datetime column, and another custom column for time. Yes, I could, but I do not like doing repetitive work. Let's try to automate it.

Firstly, we need some test data.

```// create a table with some datetime columns
Source = Table.FromRecords(
    {
        [OrderID = 1, CustomerID = 1, OrderPlaced_dt = #datetime(2019,6,16,10,0,0), OrderPaid_dt = #datetime(2019,6,16,10,5,0), OrderShipped_dt = #datetime(2019,6,16,11,0,0), Price = 100.0],
        [OrderID = 2, CustomerID = 1, OrderPlaced_dt = #datetime(2019,6,16,12,12,12), OrderPaid_dt = #datetime(2019,6,16,13,13,13), OrderShipped_dt = null, Price = 200.0]
    },
    type table[OrderID = Int64.Type, CustomerID = Int64.Type, OrderPlaced_dt = DateTime.Type, OrderPaid_dt = DateTime.Type, OrderShipped_dt = DateTime.Type, Price = Decimal.Type]
),```

Next, let’s create a function which has 2 parameters: a source table and the name of a datetime column. This function does the same as what you would do in the UI: it creates 2 new columns and removes the original one.

```// split a datetime column into 2 columns: date and time
fnSplitDateTimeColumn = (parTbl as table, parColumnName as text) as table =>
    let
        // row-level expression extracting the date part
        transformDateExpression = Expression.Evaluate("each Date.From([" & parColumnName & "])", #shared),
        // row-level expression extracting the time part (null-safe)
        transformTimeExpression = Expression.Evaluate("each try #time(Time.Hour([" & parColumnName & "]),Time.Minute([" & parColumnName & "]),Time.Second([" & parColumnName & "])) otherwise null", #shared),

        // add a new column with date
        addDateColumn = Table.AddColumn(parTbl, parColumnName & "_date", transformDateExpression, type date),
        // add a new column with time
        addTimeColumn = Table.AddColumn(addDateColumn, parColumnName & "_time", transformTimeExpression, type time),
        // remove the original datetime column
        removeDateTimeColumn = Table.RemoveColumns(addTimeColumn, {parColumnName})
    in
        removeDateTimeColumn,```

And finally, in the last step, we create another function, which is recursive. This function also has 2 parameters: a source table and a list of all datetime column names which we haven’t transformed yet. The function takes the first item from the list (a datetime column name), splits this datetime column into 2 columns, and calls itself recursively. The recursive call uses the last step as the new source table and the list of all datetime column names except the first one (which is already processed). Once the list is empty, the recursion terminates. The recursive function returns the modified source table – instead of each datetime column there are now 2 columns.

```// recursive function which splits all datetime columns into date and time columns
// parTbl is a source table to modify, parColumnNameList is a list of columns to split
fnSplitAllDateTimeColumns = (parTbl as table, parColumnNameList as list) as table =>
    // if parColumnNameList is empty, terminate the recursion
    if List.IsEmpty(parColumnNameList) then
        parTbl
    else
        let
            // get one column name to process
            currentColumnName = List.First(parColumnNameList),
            // remove the first item from parColumnNameList
            nextColumnNameList = List.RemoveFirstN(parColumnNameList, 1),

            // split the current column
            splitOneColumnTable = fnSplitDateTimeColumn(parTbl, currentColumnName),
            // call itself recursively with the newly created table and the shortened column name list
            nextIterationTable = @fnSplitAllDateTimeColumns(splitOneColumnTable, nextColumnNameList)
        in
            nextIterationTable,```

And everything together:

```let
    // create a table with some datetime columns
    Source = Table.FromRecords(
        {
            [OrderID = 1, CustomerID = 1, OrderPlaced_dt = #datetime(2019,6,16,10,0,0), OrderPaid_dt = #datetime(2019,6,16,10,5,0), OrderShipped_dt = #datetime(2019,6,16,11,0,0), Price = 100.0],
            [OrderID = 2, CustomerID = 1, OrderPlaced_dt = #datetime(2019,6,16,12,12,12), OrderPaid_dt = #datetime(2019,6,16,13,13,13), OrderShipped_dt = null, Price = 200.0]
        },
        type table[OrderID = Int64.Type, CustomerID = Int64.Type, OrderPlaced_dt = DateTime.Type, OrderPaid_dt = DateTime.Type, OrderShipped_dt = DateTime.Type, Price = Decimal.Type]
    ),

    // split a datetime column into 2 columns: date and time
    fnSplitDateTimeColumn = (parTbl as table, parColumnName as text) as table =>
        let
            // row-level expression extracting the date part
            transformDateExpression = Expression.Evaluate("each Date.From([" & parColumnName & "])", #shared),
            // row-level expression extracting the time part (null-safe)
            transformTimeExpression = Expression.Evaluate("each try #time(Time.Hour([" & parColumnName & "]),Time.Minute([" & parColumnName & "]),Time.Second([" & parColumnName & "])) otherwise null", #shared),

            // add a new column with date
            addDateColumn = Table.AddColumn(parTbl, parColumnName & "_date", transformDateExpression, type date),
            // add a new column with time
            addTimeColumn = Table.AddColumn(addDateColumn, parColumnName & "_time", transformTimeExpression, type time),
            // remove the original datetime column
            removeDateTimeColumn = Table.RemoveColumns(addTimeColumn, {parColumnName})
        in
            removeDateTimeColumn,

    // recursive function which splits all datetime columns into date and time columns
    // parTbl is a source table to modify, parColumnNameList is a list of columns to split
    fnSplitAllDateTimeColumns = (parTbl as table, parColumnNameList as list) as table =>
        // if parColumnNameList is empty, terminate the recursion
        if List.IsEmpty(parColumnNameList) then
            parTbl
        else
            let
                // get one column name to process
                currentColumnName = List.First(parColumnNameList),
                // remove the first item from parColumnNameList
                nextColumnNameList = List.RemoveFirstN(parColumnNameList, 1),

                // split the current column
                splitOneColumnTable = fnSplitDateTimeColumn(parTbl, currentColumnName),
                // call itself recursively with the newly created table and the shortened column name list
                nextIterationTable = @fnSplitAllDateTimeColumns(splitOneColumnTable, nextColumnNameList)
            in
                nextIterationTable,

    // get the names of all columns having the datatype datetime
    DateTimeColumnNames = Table.ColumnsOfType(Source, {type datetime}),
    // split all datetime columns
    SplitAllDateTimeColumns = fnSplitAllDateTimeColumns(Source, DateTimeColumnNames)
in
    SplitAllDateTimeColumns```

And the result:

Is it a lot of code for such a simple task, which you could manage in the PowerQuery Editor with its shiny UI? Yes and no. If you have many fact tables with a lot of datetime columns in your ETL, you'll be very happy to have the opportunity to automate this. But if you have just one table with 2 datetime columns, it is overkill.
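The same recursive pattern translates nearly one-to-one into other tools; here is a sketch in Python/pandas (the function and column names are mine for illustration, not part of the original post):

```python
import pandas as pd

def split_all_datetime_columns(df, column_names):
    """Recursively split each datetime column into a date and a time column."""
    if not column_names:                    # empty list terminates the recursion
        return df
    col, rest = column_names[0], column_names[1:]
    df = df.assign(**{
        col + "_date": df[col].dt.date,     # date part
        col + "_time": df[col].dt.time,     # time part
    }).drop(columns=[col])                  # remove the original datetime column
    return split_all_datetime_columns(df, rest)

orders = pd.DataFrame({
    "OrderID": [1, 2],
    "OrderPlaced_dt": pd.to_datetime(["2019-06-16 10:00:00", "2019-06-16 12:12:12"]),
    "OrderPaid_dt": pd.to_datetime(["2019-06-16 10:05:00", "2019-06-16 13:13:13"]),
})
datetime_columns = orders.select_dtypes("datetime").columns.tolist()
result = split_all_datetime_columns(orders, datetime_columns)
print(result.columns.tolist())
```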

Do you know any other pragmatic use case for recursive functions in PowerQuery? Please, let me know down in the comments 😊


## Cost-Benefit Analysis using machine learning in Power BI

Imagine you are the owner of a local restaurant and you’ve had modest success with a marketing coupon campaign in the past. You sent out some coupons through the local mail and ended up noticing an uptick in business over the next few weeks. Now you’re wondering which customers you should target next. Additionally, you’d like to know if there’s anything you can do to analyze the coupon campaign with the goal of maximizing benefits and revenue while minimizing costs. Cost-benefit analysis, further referred to as CBA, is one such analytical tool. Now when building binary prediction models using AutoML in Power BI, the resulting report allows you to use CBA with your data.


## Vertipaq Analyzer – How to Connect to Power BI Desktop

If you have a dataset in Power BI Desktop and you want to see what happens under the hood of the Vertipaq Engine, there is a great tool known as Vertipaq Analyzer. It can tell you how much space tables and columns consume, what their data types are, what compression algorithms they use, and so on. But before you can start analyzing your dataset, you need to set everything up.


## Review of new features & Announcements for #PowerBI – From the Microsoft Business Applications Summit

A recap of all the new features and announcements from the Microsoft Business Applications Summit


## On demand now — full session lineup from Microsoft Business Applications Summit!

The conference might be over, but there’s so much more in store! Get ready to transform your business with the latest in Microsoft Business Applications. Explore 200+ sessions, workshops, and keynotes from Microsoft Business Applications Summit, available now in the Power BI Community.


## Watch Microsoft Business Applications Summit sessions on-demand!

The summit might be over, but there’s so much more to explore!


## Think inside the box - Custom visuals using Standard Visuals

Custom visuals are great and there is a variety of solutions that they help us solve.

But can you emulate custom visuals without importing them from the store?


## Azure Machine Learning in Power BI Dataflows

Harness the power of Azure Machine Learning in Power BI


## Watch Live: Microsoft Business Applications Summit Opening Keynote

Can’t make it to Microsoft Business Applications Summit this year? Don’t worry, we’ve got you covered – catch the opening keynote live from Atlanta, Georgia June 10 at 8:30 a.m. EDT


## Visual BI’s ValQ Launch Experiences the Power of Microsoft Power BI People, Platform and Partnership

On Friday afternoon, May 17th, Visual BI’s ValQ for modern digital planning was introduced to the world through Microsoft’s May 2019 Power BI Desktop summary blog. This made the week a blockbuster for ValQ (with 1,000+ product downloads and 100K+ views) and the Power BI user community (judging by the strongly positive feedback and excitement on social media).


Hello to all,

One question I see currently showing up on the forum is how to have different header colours, but most of the time, since conditional formatting is only applied to values, people tend to say it's not possible.

As mentioned, conditional formatting cannot be applied to a column header; however, you are allowed to:

• Apply the same format of the column values to the header
• Apply conditional formatting to the values (text and/or background)


## Last chance! Register for Microsoft Business Applications Summit today.

Make your plan and embrace what’s next for your business. Don’t miss Microsoft Business Applications Summit, coming to Atlanta, Georgia, June 10 – 11, 2019. We’re gearing up for an amazing 2+ days filled with fresh approaches, the latest technologies, and new ways to power digital transformation at every level.


## [Support-Info] Automatic Updates of Power BI Desktop

All,

Hello. Tim Macaulay here again, Sr. Support Escalation Engineer on the Power BI Support Team here at Microsoft. I recently had an interesting discussion around the topic of automating the installation of the Power BI Desktop product. Let's talk about it. Please let me know if you have any questions and/or feedback.

# PRODUCT FOCUS / PRODUCTS INVOLVED

Microsoft Power BI Desktop

# THE END GOAL / PROBLEM SCENARIO DESCRIPTION

In this discussion, we are looking for a way to automatically update the Power BI Desktop product without end-user intervention.

1. Is this possible?  If so, how can one achieve the automatic update?
2. What kind of special permissions are needed to bring this to success?

# DISCUSSION TOPICS

## Power BI Desktop Store App

The Power BI Desktop Store App utilizes the Windows 10 App Installer to execute the installation. The key thing to note about the App Installer is that it does not utilize Programs and Features inside of Control Panel.

That stated, when the Power BI Desktop Store App is updated, the update is pushed down to the installed Power BI Desktop App without requiring any interaction by the end-user. Additionally, it does not require any administrative permissions to execute the update.

The Power BI Download is a Windows Installer based install and is deployed each month as an MSI (Microsoft Installer) file. The Windows Installer based install does add the application to Programs and Features in the Control Panel.

At this point in time, there is no out-of-the-box way to automatically update the Power BI Download installation. In order to update the Power BI application that has been downloaded, you must first uninstall the current Power BI and then install the new Power BI with the new MSI file.

The question is how one can uninstall / install to create an automated installation. To achieve this, one can utilize msiexec.exe to execute the uninstall / install via the command-line.

## Windows Guidance and Examples

Here, find information on syntax and examples.

One can discover information about the command-line syntax here at the Microsoft Documentation on the Windows Installer Command-Line: https://docs.microsoft.com/en-us/windows/desktop/msi/command-line-options

One can utilize the

• /x switch to uninstall the product
• /i to install the product
• /q for quiet mode

Uninstall

Msiexec /x <GUID from the uninstall key> /qn

NOTE

You can find the GUID for the Power BI Product in the Uninstall Key

HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Uninstall

NOTE: To obtain the GUID key from the registry, one could use PowerShell to iterate through the Uninstall registry key. There are some great blogs out there that can assist with such a task. The snapshot below is just an illustration; the GUID might be different.

Install

Msiexec /i "<Location and filename of the Power BI MSI>" /qn ACCEPT_EULA=1

In many cases, companies would utilize deployment software like Microsoft System Center Configuration Manager to do this.
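Putting the pieces together, an unattended update boils down to running the uninstall and then the install silently. A minimal sketch (Python used only to assemble the command strings; the GUID and MSI path are placeholders, not real values, and the EULA property is commonly written ACCEPT_EULA=1):

```python
# Sketch only, not official tooling: build the msiexec command lines for an
# unattended uninstall/reinstall cycle. On a real machine the GUID comes from
# the Uninstall registry key mentioned above.
def build_update_commands(product_guid: str, msi_path: str) -> list:
    uninstall = f"msiexec /x {product_guid} /qn"
    install = f'msiexec /i "{msi_path}" /qn ACCEPT_EULA=1'
    return [uninstall, install]

commands = build_update_commands(
    "{12345678-ABCD-1234-ABCD-1234567890AB}",   # placeholder GUID
    r"C:\Installers\PowerBIDesktop.msi",        # placeholder path
)
for cmd in commands:
    print(cmd)
```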

## Meet Community stars at Microsoft Business Applications Summit!

One of the primary reasons people go to conferences is to meet the presenters.

At the Microsoft Business Applications Summit (MBAS), we have an entire “Ask the Experts” experience to make this easy.


## [SUPPORT-INFO]: Export-PowerBIReport returns: invalid status code 'Unauthorized'

Error message received when executing the PowerShell cmdlet Export-PowerBIReport


## Microsoft Business Applications Summit: Top 10 reasons to attend!

We can't wait for Microsoft Business Applications Summit this year – and it’s almost here! Coming to Atlanta, Georgia June 10 – 11, this is your chance to get hands-on with your favorite tools, go one-on-one with experts and engineers, and collaborate with our vibrant community of data drivers.


## [HOWTO] Export Office 365 Audit Logs with focus on Power BI using PowerShell

PRODUCTS / COMPONENTS INVOLVED

• Microsoft Power BI
• Microsoft Office 365 - Audit Logs
• PowerShell

PROBLEM SCENARIO DESCRIPTION / GOAL

The goal here is to look for a remote way to export the Office 365 Audit Log, with a focus on Power BI, for a given date range.

RESOLUTION ( SCRIPT )

I was able to get the below script to work in my environment. It is intentionally very simple, to illustrate how to achieve the given goal.

NOTE: The following script is provided as a sample / illustration.

```Set-ExecutionPolicy RemoteSigned
$UserCredential = Get-Credential
# the link specified here is strictly for use in PowerShell and you will need to authenticate to use it
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session
Search-UnifiedAuditLog -StartDate 4/11/2019 -EndDate 5/9/2019 -RecordType PowerBI -ResultSize 1000 | Format-Table | More```

Here is the product documentation on the Search-UnifiedAuditLog PowerShell cmdlet: https://docs.microsoft.com/en-us/powershell/module/exchange/policy-and-compliance-audit/search-unifi...


## How to create a support ticket in Power BI

If you are a Power BI Pro user looking for support from the Microsoft Power BI support team, raise a ticket.

1. Go to http://support.powerbi.com
2. Scroll towards the bottom of the page. Under Still having issues, click CREATE SUPPORT TICKET.

Note: Pro users get free support.

3. Once you click on it, you will be redirected to https://powerbi.microsoft.com/en-us/support/pro/
4. You can search for the issue. If the issue is already listed, it will be displayed. It will also give you the option to SUBMIT A TICKET.
5. Click on CONTINUE TO SUPPORT TICKET. You will be redirected to https://powerbi.microsoft.com/en-us/support/pro/ticket/

Once you submit the ticket, you will get an email from support@mail.support.microsoft.com, and someone from the Microsoft support team will contact you.


## Power BI helps Improve Ocean Health!

Ashleigh McGovern develops strategy, raises money, and crafts communications for the Center for Oceans at Conservation International (CI). Among other things, she’s responsible for driving innovative solutions with CI's global teams, partners, and donors to improve ocean health.



## Live now, full session catalog for Microsoft Business Applications Summit 2019

Time to get excited – the full session catalog for Microsoft Business Applications Summit is here. Explore every breakout session and workshop coming to the conference, taking place in Atlanta, Georgia June 10 – 11, 2019. Get ready to flex your skills – and build new ones – with all things Power BI and beyond.


## Data Profiling in Power BI (Power BI Update April 2019)

As per the April 2019 update, Microsoft has introduced a data profiling capability in Power BI desktop.
Data profiling helps us easily find the issues with our imported data from data sources in to Power BI.


## Tips and Tricks for Automated Machine Learning reports

Recently we announced the capability of creating Machine Learning models directly in PowerBI. This is powered by automated machine learning from Azure.

You can create an ML model based on your data with a few simple steps using this new capability, so this post aims to shed some light on the next steps, model performance and application, and provide a few tips about the reports.


## Building a Regression Model with zero code in PowerBI

I love using PowerBI for analysis! Recently PowerBI added support for AutoML (supervised automated machine learning), and I wanted to demonstrate the new ML capabilities with an end-to-end example. In this article, I use AutoML to create and apply a regression model. Regression is a machine learning technique for understanding the relationship between input and output variables (the inputs are what is known in your data; the output is generally the effect of the inputs and is sometimes unknown). The regression technique gives us more insight into data by making it easier to see the relation between dependent and independent variables. These are the five steps I followed in this article to create the regression model:

Refer to this tutorial for a detailed walk-through of each of these steps, and this tutorial for the ones specific to regression models.

## Creating a dataflow with the input data

I created a dataflow using the House Sales prediction dataset from Kaggle. For the purposes of this article, it is also available at the following link: house sales prediction. It contains sale prices, number of bedrooms, square footage, number of floors, latitude, longitude, etc. of houses in King County (which includes Seattle) sold between May 2014 and May 2015. The Price attribute, indicating the price of the house, is the numeric field used as the label for the regression model. The dataset was split into two entities, HousePrices2014 and HousePrices2015, based on the year attribute.

Note that though most of the attribute data types in this dataset are numeric, they may be interpreted as strings by Power Query. So most of these fields, including the label, were converted to numeric.
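That conversion step can be sketched as follows (toy values standing in for the Kaggle data; the column names only echo the dataset):

```python
import pandas as pd

# fields read in as strings, as a query tool may interpret them
raw = pd.DataFrame({
    "price": ["221900", "538000"],      # the label
    "sqft_living": ["1180", "2570"],
    "grade": ["7", "7"],
})

# convert every string-typed field, including the label, to numeric
numeric = raw.apply(pd.to_numeric)
print(numeric.dtypes)
```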

## Training a machine learning model

I chose HousePrices2014 as the entity on which to apply ML models. Power BI analyzed the Price field and suggested Regression as the type of machine learning model to create to predict that field. It took an impressive 7 minutes or so to train 12 algorithms on 14 thousand data points.

Model validation report

After the training refresh completed, I checked out the training report. Key Influencers shows that Latitude, Sqft_living, Grade are the top predictors for the model.

To verify this, I plotted sqft_living (square footage of the home) against price. The plot does indeed show a relationship, indicating that more square feet of living space means higher prices. As shown in the sqft_living breakdown above, houses of 2.7k to 2.9k square feet have the highest share of range-1 priced houses (72k to 1M).

Likewise, a scatterplot between grade (overall grade given to the housing unit, based on the King County grading system) and price shows that houses with higher grades do have higher prices.

The model is 88% performant, which is a pretty great value for this dataset. As you can see in the chart below, most of the test samples lie on the diagonal, showing the low prediction error.

The Training Details page shows the performance of the 25 different models, how features were extracted from the inputs, and the hyperparameters for the final model used. AutoML chose an ensemble, a Prefitted Soft Voting Regressor, as the best model for this dataset.

## Applying the model

As I was satisfied with the training results, I proceeded to apply the model to the HousePrices2015 entity. A HousePrices2015 enriched RegressionModel was created, which includes the predicted output from the model. Overall, it took only around 2 minutes to score nearly 7 thousand rows!

Applying a regression model added two columns with the regression outcome and the top record-specific influencers for each prediction. The HousePrices2015 enriched Regression entity, with explanations and predicted prices, is shown below along with the actual prices in PowerBI Desktop. The local explanations contain a breakdown of how much each feature moved the specific instance away from the average price of houses. For categorical features, it shows how much the price would have changed for different values of the categories. In the highlighted example, you can see that the house price was predicted to be $554,174. The explanation says that the condition caused the price to increase by 44k from the average price of houses, and that the latitude caused the price to drop by 26k.

I established a connection to the dataflow from PBI Desktop to incorporate the model's predictions into PBI reports, so that I could visualize the results. I plotted a heat map of predicted prices using latitude and longitude. Prices were represented by a gradient from red (100% gradient stop) through green (50% gradient stop) to yellow (0% gradient stop).

We can see that, for example, Downtown Seattle and Downtown Bellevue have red regions as expected, owing to high prices.

I tried a scikit-learn Random Forest on this same dataset. It took 1 min using 3-fold cross validation, searching across 30 different combinations, using all available cores to give an R2 score of 0.85.

## Summary

Even though AutoML took longer, I am impressed to have gotten a better R2 score of 0.88 by trying not just Random Forest but 25 different models, in addition to getting instance-level explanations. Moreover, AutoML automatically does preprocessing, like dropping features such as ID that carry no useful information, and generating additional features from datetimes, like month, year, etc. In scikit-learn, on the other hand, one must encode string or datetime-like features or drop them.

Overall, I’m very excited about AutoML, because I did not have to learn the intricacies of the ML models or worry about the implementation, and still got the same results. I have done this in Python and R, and it took me considerably longer to code up and set up the hyperparameters, whereas with AutoML I just hit a bunch of buttons and voilà, I am done! Given how easy it is to get started with less effort, I encourage business analysts and even data scientists/analysts to try out AutoML!


## "Mayor of the Internet” Alexis Ohanian to keynote Microsoft Business Applications Summit

We’re so excited to announce Alexis Ohanian as our special guest keynote speaker for Microsoft Business Applications Summit, coming to Atlanta, Georgia June 10 – 11, 2019.


## Dynamic Hideable Columns in Power BI Table Visual

Power BI's slicers provide a powerful way to hide rows in a table - but no built-in feature allows columns to be shuffled and sliced in a similar way. That doesn't mean it can't be implemented, however, and in a way that is close to seamless for an end user. Using a layer of measures, we can abstract the columns of raw data from the displayed columns, and allow columns to be dynamically displayed in any position.


## Business Process Analysis in PowerBI using R visuals

Today companies store huge amounts of data related to their various business processes. This data can help discover, monitor and improve your actual business process. The process of extracting process knowledge from data is called Process Mining. Process Mining can help gain better visibility, improve KPIs and eliminate bottlenecks.

One of the popular open source packages to help with process mining is bupaR. It is an open-source, integrated suite of R packages for the handling and analysis of business process data. It was developed by the Business Informatics research group at Hasselt University, Belgium. It currently consists of many packages which can help in calculating descriptives, process monitoring and process visualization.

bupaR is the core package of the framework. It includes basic functionality for creating event log objects in R. It contains several functions to get information about an event log and provides event-log-specific versions of generic R functions. Together with the related packages, each of which has its own specific purpose, bupaR aims at supporting each step in the analysis of event data in R, from data import to online process monitoring.

The good news is that the PowerBI service now supports bupaR visuals. Let’s explore what we can do! Our attempt here is just to quickly show a few possibilities with bupaR and PowerBI. You can read more about bupaR in some of the links below. For more information on how to create R visuals in the Power BI service, please see Creating R visuals in the Power BI service and Create Power BI visuals using R.

Let’s consider the scenario of patients arriving in the emergency department of a hospital. The event data in this example comes from the "patients" dataset in the eventdataR package. I saved the sample data as a .csv file, then imported it into PowerBI Desktop; next I will show you how to use bupaR to create event logs and plot visuals from PowerBI. The data looks like the picture below in PowerBI Desktop.

If you are interested in seeing the process map for the "completed" patients event log, which starts with "Registration" and ends with "Check-out", you can create the R visual in Power BI Desktop with the following R script:

Once it gets published to Power BI service, we can see it renders as the following image.

If you want to see frequency in the process map, it can be created explicitly using the frequency function. The colors can be modified through the color_scale argument.

```library(bupaR)
library(DiagrammeR)

patientsData <- dataset
patientsData$time <- as.POSIXct(patientsData$time, tz = "GMT", format = c("%Y-%m-%d %H:%M:%OS"))
x <- patientsData %>%
    eventlog(
        activity_id = "handling",
        case_id = "patient",
        resource_id = "employee",
        activity_instance_id = "handling_id",
        lifecycle_id = "registration_type",
        timestamp = "time"
    ) %>%
    process_map(type = frequency("relative", color_scale = "Purples"), render = FALSE)
export_graph(x, "result.png", file_type = "png")```

Another example below uses Performance profile focusing on processing time of activities.

```library(bupaR)
library(DiagrammeR)

patientsData <- dataset
patientsData$time <- as.POSIXct(patientsData$time, tz = "GMT", format = c("%Y-%m-%d %H:%M:%OS"))
x <- patientsData %>%
    eventlog(
        activity_id = "handling",
        case_id = "patient",
        resource_id = "employee",
        activity_instance_id = "handling_id",
        lifecycle_id = "registration_type",
        timestamp = "time"
    ) %>%
    process_map(performance(median, "days"), render = FALSE)
export_graph(x, "result.png", file_type = "png")```

Different activity sequences in the event log can be visualized with trace_explorer. It can be used to explore frequent as well as infrequent traces. The coverage argument specifies how much of the log you want to explore. The example below shows the most frequent traces, covering 98.5% of the event log.

```library(bupaR)

patientsData <- dataset
patientsData$time <- as.POSIXct(patientsData$time, tz = "GMT", format = c("%Y-%m-%d %H:%M:%OS"))
patientsData %>%
    eventlog(
        activity_id = "handling",
        case_id = "patient",
        resource_id = "employee",
        activity_instance_id = "handling_id",
        lifecycle_id = "registration_type",
        timestamp = "time"
    ) %>%
    trace_explorer(type = "frequent", coverage = 0.985)```

The last example below shows in how many cases each activity is present.

```library(bupaR)

patientsData <- dataset
patientsData$time <- as.POSIXct(patientsData$time, tz = "GMT", format = c("%Y-%m-%d %H:%M:%OS"))
patientsData %>%
    eventlog(
        activity_id = "handling",
        case_id = "patient",
        resource_id = "employee",
        activity_instance_id = "handling_id",
        lifecycle_id = "registration_type",
        timestamp = "time"
    ) %>%
    activity_presence %>%
    plot```

Known limitation:

The dataset in PowerBI is a dataframe. To use bupaR, you'll need to convert it to an event log first, as shown in the sample R scripts above.

References:

Lei Qian  | Software Engineer at Microsoft Power BI (Artificial Intelligence) team
