
Most Recent
Jayendran | Established Member

This article is a continuation of my previous article, where we saw how to call Power BI APIs from within Power BI itself using PowerShell scripts, new APIs, M queries, and automation. In this second part, I'll explain how to create a report from that data, configure gateways and alerts, and manage the alerts with the help of Power Platform tools such as Microsoft Flow. As usual, throughout this article I'll share lots of tips and tricks.

 

[Image: Gateway_monitoring.jpg]

Read more...

Jayendran | Established Member

As a Power BI administrator, it's always difficult to monitor the on-premises gateways within your organization, especially when the number of gateways is growing rapidly. Today I'm going to explain how you can effectively administer and monitor those gateways in Power BI itself. Along with that, I'm going to give lots of tips and tricks.

 

[Image: Gateway_monitoring.jpg]

Read more...

LeiQian | Visitor

Today, companies store huge amounts of data related to their various business processes. This data can help you discover, monitor, and improve your actual business processes. The process of extracting process knowledge from data is called process mining. Process mining can help you gain better visibility, improve KPIs, and eliminate bottlenecks.

 

One of the popular open-source packages for process mining is bupaR, an open-source, integrated suite of R packages for the handling and analysis of business process data. It was developed by the Business Informatics research group at Hasselt University, Belgium. It currently consists of many packages that help with calculating descriptives, process monitoring, and process visualization.

[Image: 02.png]

bupaR is the core package of the framework. It includes basic functionality for creating event log objects in R, contains several functions for getting information about an event log, and provides event-log-specific versions of generic R functions. Together with the related packages, each of which has its own specific purpose, bupaR aims to support each step of the analysis of event data in R, from data import to online process monitoring.

 

The good news is that the Power BI service now supports bupaR visuals. Let's explore what we can do! The goal here is just to quickly show a few possibilities with bupaR and Power BI; you can read more about bupaR in the links below. For more information on how to create R visuals in the Power BI service, please see Creating R visuals in the Power BI service and Create Power BI visuals using R.

 

Let's consider the scenario of patients arriving at the emergency department of a hospital. The event data in this example comes from the "patients" dataset in the eventdataR package. I saved the sample data as a .csv file and imported it into Power BI Desktop; next I'll show you how to use bupaR to create event logs and plot visuals from Power BI. The data looks like the picture below in Power BI Desktop.

[Image: 01.png]

If you want to see the process map for the "completed" patients event log, which starts with "Registration" and ends with "Check-out", you can create the R visual in Power BI Desktop with the following R script:

[Image: script01.png]
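(The script itself appears only as an image in the original post. Below is a minimal sketch of what it might look like, following the same pattern as the scripts later in this article. The filter_endpoints helper from the edeaR package, used here to keep only cases running from "Registration" to "Check-out", is my assumption rather than the author's exact code.)

library(bupaR)
library(edeaR)        # assumed available: provides filter_endpoints for trimming cases
library(DiagrammeR)

# 'dataset' is the dataframe that Power BI passes into the R visual
patientsData <- dataset
patientsData$time <- as.POSIXct(patientsData$time, tz = "GMT", format = "%Y-%m-%d %H:%M:%OS")

x <- patientsData %>%
        eventlog(
            activity_id = "handling",
            case_id = "patient",
            resource_id = "employee",
            activity_instance_id = "handling_id",
            lifecycle_id = "registration_type",
            timestamp = "time"
        ) %>%
        # keep only "completed" cases: start at Registration, end at Check-out
        filter_endpoints(start_activities = "Registration", end_activities = "Check-out") %>%
        process_map(render = FALSE)

# write the graph to a png so the R visual can render it
export_graph(x, "result.png", file_type = "png")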

Once it is published to the Power BI service, it renders as the following image.

[Image: visual01.png]

If you want to show frequencies in the process map, they can be specified explicitly using the frequency function. The colors can be modified through the color_scale argument.

 

library(bupaR)
library(DiagrammeR)

# 'dataset' is the dataframe that Power BI passes into the R visual
patientsData <- dataset
# parse the timestamp column so bupaR can order events
patientsData$time <- as.POSIXct(patientsData$time, tz = "GMT", format = "%Y-%m-%d %H:%M:%OS")

x <- patientsData %>%
        eventlog(
            activity_id = "handling",
            case_id = "patient",
            resource_id = "employee",
            activity_instance_id = "handling_id",
            lifecycle_id = "registration_type",
            timestamp = "time"
        ) %>%
        # relative frequencies, colored with the "Purples" scale
        process_map(type = frequency("relative", color_scale = "Purples"), render = FALSE)

# write the graph to a png so the R visual can render it
export_graph(x, "result.png", file_type = "png")

[Image: visual02.png]

 

The next example uses a performance profile, focusing on the processing time of the activities.

 

library(bupaR)
library(DiagrammeR)

patientsData <- dataset
patientsData$time <- as.POSIXct(patientsData$time, tz = "GMT", format = "%Y-%m-%d %H:%M:%OS")

x <- patientsData %>%
        eventlog(
            activity_id = "handling",
            case_id = "patient",
            resource_id = "employee",
            activity_instance_id = "handling_id",
            lifecycle_id = "registration_type",
            timestamp = "time"
        ) %>%
        # median processing time per activity, expressed in days
        process_map(performance(median, "days"), render = FALSE)

export_graph(x, "result.png", file_type = "png")

 

[Image: visual03.png]

Different activity sequences in the event log can be visualized with trace_explorer, which can be used to explore frequent as well as infrequent traces. The coverage argument specifies how much of the log you want to explore. The example below shows the most frequent traces, covering 98.5% of the event log.

 

library(bupaR)

patientsData <- dataset
patientsData$time <- as.POSIXct(patientsData$time, tz = "GMT", format = "%Y-%m-%d %H:%M:%OS")

patientsData %>%
    eventlog(
        activity_id = "handling",
        case_id = "patient",
        resource_id = "employee",
        activity_instance_id = "handling_id",
        lifecycle_id = "registration_type",
        timestamp = "time"
    ) %>%
    # show the most frequent traces that together cover 98.5% of the log
    trace_explorer(type = "frequent", coverage = 0.985)

[Image: visual04.png]

The last example below shows in how many cases each activity is present.

 

library(bupaR)

patientsData <- dataset
patientsData$time <- as.POSIXct(patientsData$time, tz = "GMT", format = "%Y-%m-%d %H:%M:%OS")

patientsData %>%
    eventlog(
        activity_id = "handling",
        case_id = "patient",
        resource_id = "employee",
        activity_instance_id = "handling_id",
        lifecycle_id = "registration_type",
        timestamp = "time"
    ) %>%
    # plot the share of cases in which each activity occurs
    activity_presence() %>%
    plot()

 

[Image: visual05.png]

 

Known limitation:

The dataset in Power BI is a dataframe. To use bupaR, you'll need to convert it to an event log first, as shown in the sample R scripts above.

 

References:

1. https://en.wikipedia.org/wiki/Process_mining

2. https://www.bupar.net/index.html

3. https://www.r-bloggers.com/bupar-business-process-analysis-with-r/

 

Lei Qian  | Software Engineer at Microsoft Power BI (Artificial Intelligence) team

marekr | New Member

Automated ML integration with Power BI dataflows allows training and applying Binary Prediction, General Classification, and Regression models. The ML models are internally represented as specially marked dataflow entities. I'll describe how the ML-related entities are defined in the M language and how they can be edited using the Power Query editor.

Read more...

Jayendran | Established Member

Introduction

Google Location Tracking

We have all used Google location services when travelling to new areas or to find exactly where we are: we simply turn on location on our mobile. As soon as we turn it on, Google tracks our location with exact latitude and longitude; in fact, Google records our latitude and longitude every 3-5 seconds. Considering this amount of data for everyone across the world, it's pretty big.

 

In this article, we'll see how to visualize our own Google location data using the Microsoft Power BI tool. So this article is going to combine the power of two big players:

  1. Google for the data
  2. Microsoft for the technology

Before diving further into this article, let's see how the final report will look.

 

Read more...

ghamers | Occasional Visitor

Data preparation can go a long way in improving the results of machine learning models. Before getting started with AutoML in Power BI, take a moment to explore some data cleaning techniques with your data. All of the necessary tools you’ll need already exist in the Power BI ecosystem. 

 

Standardization (Relabeling) 

 

Imagine you have a text column describing college degrees, e.g. "Master's Degree", "Bachelor's Degree", etc. Depending on how the data entry was done, you might end up with values such as "M.A.", "Masters", and "Master's Degree", all meaning the same thing. By default, a machine learning model makes no assumptions about these fields being synonyms and ends up treating them as unique entries. In most cases, it would be ideal if the model analyzed these varying entries the same way.

 

Once your data is available as an entity in Power Query Online, you can remedy this discrepancy using the "Replace values" feature. To find this functionality, simply right-click the desired column header and select "Replace values". Use the "Advanced" mode to match the entire contents of a cell rather than the default partial matching.
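Behind the scenes, "Replace values" generates Table.ReplaceValue steps. Here is a minimal sketch of the resulting M; the "Degree" column and the sample values are hypothetical. Replacer.ReplaceValue matches the entire cell contents (what the "Advanced" whole-value match produces), whereas Replacer.ReplaceText would also replace partial occurrences inside a cell.

let
    // hypothetical sample: degree labels entered inconsistently
    Source = #table({"Degree"}, {{"M.A."}, {"Masters"}, {"Master's Degree"}, {"Bachelor's Degree"}}),
    // Replacer.ReplaceValue matches the entire cell contents ("Advanced" mode)
    #"Replaced M.A." = Table.ReplaceValue(Source, "M.A.", "Master's Degree", Replacer.ReplaceValue, {"Degree"}),
    #"Replaced Masters" = Table.ReplaceValue(#"Replaced M.A.", "Masters", "Master's Degree", Replacer.ReplaceValue, {"Degree"})
in
    #"Replaced Masters"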

 

 

[Image: Standardization.PNG]


Discretizing and Binning (Bucketing) 

 

With AutoML in Power BI, it is possible to create a powerful prediction model if your entity has a True/False column, that is, a column containing two distinct values indicating separate states. A numeric column is the easiest type of column to convert to a True/False column. In Power Query Online, this conversion can be achieved using a Conditional Column. In this example, imagine you have a numeric column of scores ranging from 0 to 100. We can narrow this column down to a True/False column by separating values above sixty from those below. After adding the conditional column, remember to set the new column's type to True/False by clicking the type icon next to the column header.
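As a minimal sketch of the M such a conditional column generates (the "Score" column, the sixty threshold, and the sample rows are assumptions for illustration):

let
    // hypothetical sample: scores from 0 to 100
    Source = #table(type table [Score = number], {{45}, {72}, {88}, {59}}),
    // conditional column: true when the score is sixty or above
    #"Added conditional column" = Table.AddColumn(Source, "Passed", each if [Score] >= 60 then true else false),
    // set the new column's type to True/False so AutoML treats it as a label
    #"Changed column type" = Table.TransformColumnTypes(#"Added conditional column", {{"Passed", type logical}})
in
    #"Changed column type"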

 

 

[Image: Bucketing.PNG]

 

 

Removing Outliers 

 

There are times when a numeric column may have entries that are largely different from the rest of the values in the column. In most cases, the presence of these outlier values provides little benefit for a machine learning model. Let's say we define an outlier for a numeric column as a value falling outside two standard deviations above or below the median value of the column. In this section, we build upon the concept of conditional columns and enhance it with a little extra Power Query magic. For your table entity, open the Advanced Editor:
 

 

 

[Image: Outliers.PNG]

 

 

Next, locate the column for which you wish to remove outliers; in this case the column header is "Fare". We will create two new variables: one storing twice the column's standard deviation, and one storing its median.
 

 

#"Two Standard Deviations" = List.StandardDeviation(#"Changed column type"[Fare]) * 2, 
#"Medium Value" = List.Median(#"Changed column type"[Fare]), 

 

 

Now we will use a conditional column to identify outliers by comparing each Fare value to the median value plus or minus two standard deviations. If the value falls outside that range, we replace it with the overall median value.

 

 

#"Outliers Replaced" = Table.AddColumn(#"Changed column type", "New column", each if [Fare] < #"Medium Value" - #"Two Standard Deviations" then #"Medium Value" else if [Fare] > #"Medium Value" + #"Two Standard Deviations" then #"Medium Value" else [Fare]) 

 

 

When using this code snippet, simply replace “Fare” with the header name of your desired column. An example section of Power Query code to perform the outlier replacement follows: 

 

 

let
    // load the source csv (replace <YOUR_CSV_SOURCE> with your own URL)
    Source = Csv.Document(Web.Contents("<YOUR_CSV_SOURCE>"), [Delimiter = ",", Columns = 12, QuoteStyle = QuoteStyle.None]),
    #"Promoted headers" = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    #"Changed column type" = Table.TransformColumnTypes(#"Promoted headers", {{"Fare", type number}}),
    #"Two Standard Deviations" = List.StandardDeviation(#"Changed column type"[Fare]) * 2,
    #"Median Value" = List.Median(#"Changed column type"[Fare]),
    // values more than two standard deviations from the median are replaced with the median
    #"Outliers Replaced" = Table.AddColumn(#"Changed column type", "New column", each
        if [Fare] < #"Median Value" - #"Two Standard Deviations" then #"Median Value"
        else if [Fare] > #"Median Value" + #"Two Standard Deviations" then #"Median Value"
        else [Fare])
in
    #"Outliers Replaced"

 

 

With only these simple techniques at your disposal, you can start to build powerful, narrowed-down machine learning models in Power BI. Keep a watch out for more advanced techniques that will be covered in future posts. In the meantime, get started building models in Power BI today and get a feel for how a bit of data preparation can have a positive impact on the resulting model reports.

 

 

Yasin Shtiui | Software Engineer II at Microsoft Power BI (Artificial Intelligence) team

Garrett Hamers | Software Engineer at Microsoft Power BI (Artificial Intelligence) team

Yuchen | Visitor

Power BI, in its latest release, added support for supervised automated machine learning. This means that Power BI can help predict 'unknowns' once it learns from the 'known' values.

Read more...

TedPattison | Advisor

So many people on the Power BI team at Microsoft have been working day and night for the last two years making the Power BI platform and its constantly-evolving feature set more competitive against Tableau, the industry leader. Those of us outside of Microsoft in the Power BI community can also help to make Power BI more competitive against Tableau by working together to increase the amount of Power BI expertise in our industry. Power BI adoption numbers are sure to increase as companies become more confident in the fact that they can find the right people with the talent and skills to design custom Power BI solutions and to manage the wide-scale distribution of content published to the Power BI service. This is why Power BI certification will become increasingly important as we move into 2018.

Read more...

EnterpriseDNA | Regular Visitor

Recently I ran a free workshop for everyone connected to Enterprise DNA in some way. It was all about dashboarding and how to develop really compelling reports. Here's the reporting dashboard I showcased as I ran through my best-practice visualization techniques; I'm making it available for download here, along with the workshop recording.

 

[Image: Multi Year Performance.JPG]

 

 

 

Read more...

EnterpriseDNA | Regular Visitor

There are so many applications and situations where utilising ranking techniques adds a lot of value to your analysis.
 
I've created many videos over at Enterprise DNA TV showcasing these techniques, so I thought I'd review some of them here so you can learn more about them.
 
I'm going to review these techniques, but don't forget you could apply them in a number of different ways and in lots of situations.
 
1. Find Your Top Customers Through Time Using RANKX in Power BI w/DAX
2. Discover Top Salespeople Contribution To All Sales in Power BI w/DAX
3. Discover The Best Selling Day For Your Products w/DAX in Power BI

 

[Image: Enterprise DNA TV logo2.png]

Read more...

EnterpriseDNA | Regular Visitor

By utilizing the flexibility of the Power BI data model and the power of DAX, you can develop some unbelievably compelling visuals and reports. I'm going to take you through the steps you need to make this work well, so that you can create reports like the one below. I call this technique MULTI-THREADED DYNAMIC VISUALS.

 

[Image: dynamic vsiuals.JPG]

Read more...

Super User

I have created an Infographic where you can view which options are free and which options require a Pro license. 

 

This will hopefully make it easier to understand whether a particular option you use will require a Pro license.

Read more...

EnterpriseDNA | Regular Visitor

Using DAX to compare data or metrics over time is incredibly efficient. Using a range of techniques, in this post I'll show you how to analyse the performance of these metrics over multi-year time periods.

 

[Image: Multi Year Performance Analysis.JPG]

Read more...

EnterpriseDNA | Regular Visitor

It is amazing how much more efficient it is to complete reasonably technical analysis with Power BI. One business scenario where this will bring huge value to an organisation is when looking to understand your customers better.

Read more...

EnterpriseDNA | Regular Visitor

One type of analysis I like to use quite often is comparing this year's totals to last year's totals. I also always want to make it as dynamic as possible: all I should have to do is change the context of my calculation (i.e., bring in a new dimension like regions, products, etc.) and everything should automatically recalculate.
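As a minimal sketch of this pattern (the [Total Sales] measure and the Dates[Date] column are hypothetical names, and a proper date table is assumed), a dynamic this-year-versus-last-year comparison in DAX can look like:

// assumes a [Total Sales] measure and a Dates table marked as a date table (hypothetical names)
Sales LY =
CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( Dates[Date] ) )

Sales vs LY =
[Total Sales] - [Sales LY]

Because everything is driven by filter context, dragging a new dimension such as region or product into the visual recalculates both measures automatically.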

Read more...
