
Most Recent
Jayendran, Established Member

This article is a continuation of my previous article, where we saw how to call Power BI APIs within Power BI itself using PowerShell scripts, new APIs, M queries, and automation. In this second part I'll explain how to create a report from that data, configure gateways and alerts, and manage the alerts with the help of Power Platform tools such as Microsoft Flow. As usual, throughout this article I'll share lots of Tips & Tricks.

 


Read more...

Jayendran, Established Member

As a Power BI administrator, it's always difficult to monitor the on-premises gateways within your organization, especially when the number of gateways is growing rapidly. Today I'm going to explain how you can effectively administer and monitor those gateways in Power BI itself. Along with that, I'm going to share lots of Tips & Tricks.

 


Read more...

Super User

We all know how important it is to transform our data into a star schema and to have a fact table and dimension tables. The date dimension is one of those dimensions. We will build one together!

Read more...

EnterpriseDNA, Regular Visitor

I want to dive into two essential DAX functions that you absolutely need to understand well when using Power BI: CALCULATE and CALCULATETABLE. They are very similar in some aspects but very different in others. I'll go through them, with accompanying tutorials, throughout this post.


Read more...

jfeil, Member

Did you know that August 4th is International Friendship Day?  But we say, why just one day?  So here at the Power BI Community, we are declaring August "Power Platform Friendship Month!" And to celebrate, on August 14th, we will be recognizing the "2019 Season 2 Super Users!"

Read more...

EnterpriseDNA, Regular Visitor

I wanted to dive into the topic of supporting tables inside of Power BI in this particular blog. Supporting tables are a unique development technique, used to support logic that you might run through your core data model. Think of your core data model as the lookup and fact tables that come from your raw data.



 

Read more...

Super User

Are you feeling lost among the many columns in Power BI Desktop? Do you prefer order to disorder? Then you are in the right place.

Read more...

Super User

Every self-respecting computer scientist has heard of recursive functions (more details on Wikipedia: https://en.wikipedia.org/wiki/Recursion). Many algorithms are cited as examples of such functions, the Fibonacci sequence being the simplest:

Fib(0) = 0
Fib(1) = 1
For all integers n > 1, Fib(n) = Fib(n-1) + Fib(n-2)

 

How do we rewrite this pseudocode into a function in PowerQuery?

let
    fnFibonacci = (value as number) as number =>
        if value = 0 then
            0
        else if value = 1 then
            1
        else
            @fnFibonacci(value - 1) + @fnFibonacci(value - 2),
    Source = List.Transform({0..10}, fnFibonacci)
in
    Source

 

The most important part is the @ before the call of the recursive function. It tells PowerQuery to reference the function's own name from inside itself.

Let’s test our function:

(screenshot: the function applied to the list {0..10})

 

Well, that is great, but not very useful in real life. I have been searching for a long time for a scenario which uses recursive functions in a meaningful way. And I have found one!

In PowerQuery, you can select or rename many columns in one step. Moreover, you can change their data types at once. But repetitive modifications of a table are not so easy anymore.

 

In my ETL process written in PowerQuery, I wanted to split every datetime column in a fact table into 2 separate columns: date and time. This has at least 2 big advantages. First, the date and time values become foreign keys for my date and time dimensions; second, the size of my dataset decreases dramatically. Instead of many unique values, I get only 365 unique values a year for the date dimension and 86,400 unique values for the time dimension at a precision of one second.

What are my options? I could create a new custom column of date datatype for every datetime column and another custom column for time. Yes, I could, but I do not like doing repetitive work. Let's try to automate it.

 

Firstly, we need some test data.

// create a table with some datetime columns
    Source = Table.FromRecords(
        {  
            [OrderID = 1, CustomerID = 1, OrderPlaced_dt = #datetime(2019,6,16,10,0,0), OrderPaid_dt = #datetime(2019,6,16,10,5,0), OrderShipped_dt = #datetime(2019,6,16,11,0,0), Price = 100.0],
            [OrderID = 2, CustomerID = 1, OrderPlaced_dt = #datetime(2019,6,16,12,12,12), OrderPaid_dt = #datetime(2019,6,16,13,13,13), OrderShipped_dt = null, Price = 200.0]  
        },
        type table[OrderID = Int64.Type, CustomerID = Int64.Type, OrderPlaced_dt = DateTime.Type, OrderPaid_dt = DateTime.Type, OrderShipped_dt = DateTime.Type, Price = Decimal.Type]
    ),

 

Next, let's create a function which has 2 parameters - a source table and the name of a datetime column. This function does the same as what you would do in the UI: it creates 2 new columns and removes the original one.

    // split a datetime column into 2 columns: date and time
    fnSplitDateTimeColumn = (parTbl as table, parColumnName as text) as table =>
        let
            // add a new column with date
            transformDateExpression = Expression.Evaluate("each Date.From([" & parColumnName & "])", #shared),
            addDateColumn = Table.AddColumn(parTbl, parColumnName & "_date", transformDateExpression, type date),

            // add a new column with time
            transformTimeExpression = Expression.Evaluate("each try #time(Time.Hour([" & parColumnName & "]),Time.Minute([" & parColumnName & "]),Time.Second([" & parColumnName & "])) otherwise null", #shared),
            addTimeColumn = Table.AddColumn(addDateColumn, parColumnName & "_time", transformTimeExpression, type time),

            // remove datetime column
            removeDateTimeColumn = Table.RemoveColumns(addTimeColumn, parColumnName)
        in
            removeDateTimeColumn,

 

And finally, in the last step we create another function, which is recursive. This function also has 2 parameters - a source table and a list of all datetime column names which we haven't transformed yet. The function takes the first item from the list (a datetime column name), splits this datetime column into 2 columns, and calls itself recursively. The recursive call uses the last step as the new source table and a list of all datetime column names except the first one (which is already processed). Once the list is empty, the recursion terminates. The recursive function returns a modified source table – instead of each datetime column there are now 2 columns.

    // recursive function which splits all datetime columns into date and time columns
    // parTbl is a source table to modify, parColumnNameList is a list of columns to split
    fnSplitAllDateTimeColumns = (parTbl as table, parColumnNameList as list) as table =>
        // if parColumnNameList is empty, terminate the recursion
        if List.IsEmpty(parColumnNameList) then
            parTbl
        else
            let
                // get one column name to process
                currentColumnName = List.First(parColumnNameList),
                // remove the first item from parColumnNameList
                nextColumnNameList = List.RemoveFirstN(parColumnNameList, 1),

                // split the current column
                splitOneColumnTable = fnSplitDateTimeColumn(parTbl, currentColumnName),
                // call itself recursively with the newly created table and the shortened column name list
                nextIterationTable = @fnSplitAllDateTimeColumns(splitOneColumnTable, nextColumnNameList)
            in
                nextIterationTable,

And everything together:

let
    // create a table with some datetime columns
    Source = Table.FromRecords(
        {  
            [OrderID = 1, CustomerID = 1, OrderPlaced_dt = #datetime(2019,6,16,10,0,0), OrderPaid_dt = #datetime(2019,6,16,10,5,0), OrderShipped_dt = #datetime(2019,6,16,11,0,0), Price = 100.0],
            [OrderID = 2, CustomerID = 1, OrderPlaced_dt = #datetime(2019,6,16,12,12,12), OrderPaid_dt = #datetime(2019,6,16,13,13,13), OrderShipped_dt = null, Price = 200.0]  
        },
        type table[OrderID = Int64.Type, CustomerID = Int64.Type, OrderPlaced_dt = DateTime.Type, OrderPaid_dt = DateTime.Type, OrderShipped_dt = DateTime.Type, Price = Decimal.Type]
    ),

    // split a datetime column into 2 columns: date and time
    fnSplitDateTimeColumn = (parTbl as table, parColumnName as text) as table =>
        let
            // add a new column with date
            transformDateExpression = Expression.Evaluate("each Date.From([" & parColumnName & "])", #shared),
            addDateColumn = Table.AddColumn(parTbl, parColumnName & "_date", transformDateExpression, type date),

            // add a new column with time
            transformTimeExpression = Expression.Evaluate("each try #time(Time.Hour([" & parColumnName & "]),Time.Minute([" & parColumnName & "]),Time.Second([" & parColumnName & "])) otherwise null", #shared),
            addTimeColumn = Table.AddColumn(addDateColumn, parColumnName & "_time", transformTimeExpression, type time),

            // remove datetime column
            removeDateTimeColumn = Table.RemoveColumns(addTimeColumn, parColumnName)
        in
            removeDateTimeColumn,

    // recursive function which splits all datetime columns into date and time columns
    // parTbl is a source table to modify, parColumnNameList is a list of columns to split
    fnSplitAllDateTimeColumns = (parTbl as table, parColumnNameList as list) as table =>
        // if parColumnNameList is empty, terminate the recursion
        if List.IsEmpty(parColumnNameList) then
            parTbl
        else
            let
                // get one column name to process
                currentColumnName = List.First(parColumnNameList),
                // remove the first item from parColumnNameList
                nextColumnNameList = List.RemoveFirstN(parColumnNameList, 1),

                // split the current column
                splitOneColumnTable = fnSplitDateTimeColumn(parTbl, currentColumnName),
                // call itself recursively with the newly created table and the shortened column name list
                nextIterationTable = @fnSplitAllDateTimeColumns(splitOneColumnTable, nextColumnNameList)
            in
                nextIterationTable,

    // get all columns having the datatype datetime
    DateTimeColumnNames = Table.ColumnsOfType(Source, {type datetime}),
    // split all datetime columns
    SplitAllDateTimeColumns = fnSplitAllDateTimeColumns(Source, DateTimeColumnNames)
in
    SplitAllDateTimeColumns

 

And the result:

(screenshot: the resulting table with the split date and time columns)

 

Is this a lot of code for such a simple task, one which you could manage in the PowerQuery Editor with its shiny UI? Yes and no. If you have many fact tables with a lot of datetime columns in your ETL, you'll be very happy to have the opportunity to automate it. But if you have just one table with 2 datetime columns, it is overkill.

 

Do you know any other pragmatic use case for recursive functions in PowerQuery? Please, let me know down in the comments 😊

YasinShtiui, Frequent Visitor

Imagine you are the owner of a local restaurant and you’ve had modest success with a marketing coupon campaign in the past. You sent out some coupons through the local mail and ended up noticing an uptick in business over the next few weeks. Now you’re wondering which customers you should target next. Additionally, you’d like to know if there’s anything you can do to analyze the coupon campaign with the goal of maximizing benefits and revenue while minimizing costs. Cost-benefit analysis, further referred to as CBA, is one such analytical tool. Now when building binary prediction models using AutoML in Power BI, the resulting report allows you to use CBA with your data.

Read more...

Super User

If you have a dataset in Power BI Desktop and you want to see what happens under the hood of the Vertipaq engine, there is a great tool known as Vertipaq Analyzer. It can tell you how much space tables and columns consume, what their data types are, what compression algorithms they use, and so on. But before you can start analyzing your dataset, you need to set everything up.

Read more...

Super User

A recap of all the new features and announcements from the Microsoft Business Applications Summit

Read more...

jfeil, Member

The conference might be over, but there’s so much more in store! Get ready to transform your business with the latest in Microsoft Business Applications. Explore 200+ sessions, workshops, and keynotes from Microsoft Business Applications Summit, available now in the Power BI Community.

Read more...

jfeil, Member

The summit might be over, but there’s so much more to explore!

Read more...

Super User

Custom visuals are great, and they help us solve a wide variety of problems – from simple table formatting options to advanced analytics with forecasting and neural networks.

But can you emulate custom visuals without importing them from the store?

 

Read more...

sagarnarla, Occasional Visitor

Harness the power of Azure Machine Learning in Power BI

Read more...

jfeil, Member

Can’t make it to Microsoft Business Applications Summit this year? Don’t worry, we’ve got you covered – catch the opening keynote live from Atlanta, Georgia, June 10 at 8:30 a.m. EDT.

Read more...

jfeil, Member

On Friday afternoon, May 17th, Visual BI’s ValQ for modern digital planning was introduced to the world through Microsoft’s May 2019 Power BI Desktop summary blog. This made the week a blockbuster for ValQ (with 1,000+ product downloads and 100K+ views) and for the Power BI user community (judging by the strongly positive feedback and excitement on social media).

Read more...

Super User

Hello to all,

 

One question I currently see showing up on the forum is how to get different header colours; most of the time, since conditional formatting is only applied to values, people tend to say it's not possible.

 

It's true that conditional formatting cannot be applied to a column header; however, you are allowed to:

  • Apply the same format of the column values to the header
  • Apply conditional formatting to the values (text and/or background)
Read more...

jfeil, Member

Make your plan and embrace what’s next for your business. Don’t miss Microsoft Business Applications Summit, coming to Atlanta, Georgia, June 10 – 11, 2019. We’re gearing up for an amazing 2+ days filled with fresh approaches, the latest technologies, and new ways to power digital transformation at every level.

Read more...

timmac, Frequent Visitor

Hello, all. Tim Macaulay here again, Senior Support Escalation Engineer on the Power BI Support Team here at Microsoft. I recently had an interesting discussion around the topic of automating the installation of the Power BI Desktop product. Let's talk about it. Please let me know if you have any questions and/or feedback.

 

PRODUCT FOCUS / PRODUCTS INVOLVED

Microsoft Power BI Desktop

 

 

THE END GOAL / PROBLEM SCENARIO DESCRIPTION

In this discussion, we are looking for a way to automatically update the Power BI Desktop product without end-user intervention. 
 

  1. Is this possible? If so, how can one achieve the automatic update?
  2. What special permissions are needed to make this succeed?
     

 

 

DISCUSSION TOPICS


 

Power BI Desktop Store App

The Power BI Desktop Store App utilizes the Windows 10 App Installer to execute the installation. The key thing to note about the App Installer is that it does not utilize Programs and Features inside the Control Panel.

That said, when the Power BI Desktop Store App is updated, the update is pushed down and applied to the installed Power BI Desktop App without requiring any interaction from the end user. Additionally, it does not require any administrative permissions to execute the update.

One can discover more information here: https://docs.microsoft.com/en-us/power-bi/desktop-get-the-desktop#install-as-an-app-from-the-microso...

 

  1. Enable Automatic Updates: https://support.microsoft.com/en-ca/help/15081/windows-turn-on-automatic-app-updates
  2. Download Power BI Desktop from the Microsoft Store: https://powerbi.microsoft.com/en-us/desktop/


 

 

 

Power BI Download

(Download from Internet)

The Power BI Download is a Windows Installer based install and is deployed each month as an MSI (Microsoft Installer) file. The Windows Installer based install does add the application to Programs and Features in the Control Panel.

At this point in time, there is no out-of-the-box way to automatically update the Power BI Download installation. In order to update a Power BI application installed this way, you must first uninstall the current Power BI and then install the new Power BI from the new MSI file.

 

The question is how one can uninstall and install in a way that creates an automated installation. To achieve this, one can use msiexec.exe to execute the uninstall and install via the command line.

 

Windows Guidance and Examples

Here you can find information on the syntax, along with examples.

 

One can discover information about the command-line syntax here at the Microsoft Documentation on the Windows Installer Command-Line: https://docs.microsoft.com/en-us/windows/desktop/msi/command-line-options
 

One can utilize the

  • /x switch to uninstall the product
  • /i to install the product
  • /q for quiet mode

 

Uninstall

Msiexec /x <GUID from the uninstall key> /qn

 

NOTE

You can find the GUID for the Power BI product in the Uninstall key:

HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Uninstall

NOTE

To obtain the GUID from the registry, one could use PowerShell to iterate through the Uninstall key; a minimal sketch follows below.

There are some great blogs out there that can assist with such a task.
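
As an illustration only, here is a minimal PowerShell sketch of that idea. The display-name filter ("*Power BI Desktop*") is an assumption and may need adjusting for your environment.

# Illustrative sketch: find the Power BI Desktop product GUID by iterating the Uninstall key
$uninstallKey = "HKLM:\Software\Microsoft\Windows\CurrentVersion\Uninstall"
Get-ChildItem $uninstallKey |
    ForEach-Object { Get-ItemProperty $_.PSPath } |
    Where-Object { $_.DisplayName -like "*Power BI Desktop*" } |
    Select-Object DisplayName, PSChildName    # PSChildName holds the {GUID}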

 

The snapshot below is just an illustration; the GUID might be different.

(screenshot: an example Uninstall registry entry)

 


 

Install

Msiexec /i "<Location and filename of the Power BI MSI>" /qn ACCEPT_EULA=1
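
To tie the uninstall and install together, one could wrap both msiexec calls in a small PowerShell script. The following is a sketch only; the GUID and the MSI path are placeholders you must supply.

# Hypothetical wrapper around the two msiexec commands shown above.
# Replace the GUID and MSI path placeholders with your own values.
$productGuid = "{00000000-0000-0000-0000-000000000000}"   # from the Uninstall key
$msiPath     = "C:\Installers\PBIDesktop_x64.msi"         # newly downloaded MSI

# silently uninstall the current version and wait for it to finish
Start-Process msiexec.exe -ArgumentList "/x $productGuid /qn" -Wait

# silently install the new version, accepting the EULA
Start-Process msiexec.exe -ArgumentList "/i `"$msiPath`" /qn ACCEPT_EULA=1" -Wait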


 

 

 

In many cases, companies handle this by utilizing deployment software such as Microsoft System Center Configuration Manager.


 

 

 

 

LINKS / RESOURCES / ADDITIONAL INFORMATION

chass, Administrator

One of the primary reasons people go to conferences is to meet the presenters.

At the Microsoft Business Applications Summit (MBAS), we have an entire “Ask the Experts” experience to make this easy.

Read more...

timmac, Frequent Visitor

Error message received when executing the PowerShell cmdlet Export-PowerBIReport

Read more...

jfeil, Member

We can't wait for Microsoft Business Applications Summit this year – and it’s almost here! Coming to Atlanta, Georgia June 10 – 11, this is your chance to get hands-on with your favorite tools, go one-on-one with experts and engineers, and collaborate with our vibrant community of data drivers.

Read more...

timmac, Frequent Visitor

PRODUCTS / COMPONENTS INVOLVED

  • Microsoft Power BI 
  • Microsoft Office 365 - Audit Logs
  • PowerShell

 

PROBLEM SCENARIO DESCRIPTION / GOAL

The goal here is to find a remote way to export the Office 365 Audit Log, with a focus on Power BI, for a given date range.

 

RESOLUTION (SCRIPT)

I was able to get the script below to work in my environment. It is kept very simple, to provide an illustration of how to achieve the given goal.

 

NOTE: The following script is provided as a sample / illustration.

Script Block

# allow locally created scripts to run
Set-ExecutionPolicy RemoteSigned
# prompt for Office 365 credentials
$UserCredential = Get-Credential
# the ConnectionUri below is the Exchange Online PowerShell endpoint; you will need to authenticate to use it
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session
# pull the Power BI audit entries for the given date range and page through them
Search-UnifiedAuditLog -StartDate 4/11/2019 -EndDate 5/9/2019 -RecordType PowerBI -ResultSize 1000 | Format-Table | More
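
As a small variation on the script above (again, just an illustration), the results can be written to a CSV file instead of being paged to the screen; the output path is a placeholder.

# Illustrative variation: export the Power BI audit entries to a CSV file
Search-UnifiedAuditLog -StartDate 4/11/2019 -EndDate 5/9/2019 -RecordType PowerBI -ResultSize 1000 |
    Select-Object CreationDate, UserIds, Operations, AuditData |
    Export-Csv -Path "C:\Temp\PowerBI_AuditLog.csv" -NoTypeInformation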

 

ADDITIONAL INFORMATION / RESOURCE LINKS

Here is the product documentation on the Search-UnifiedAuditLog PowerShell cmdlet: https://docs.microsoft.com/en-us/powershell/module/exchange/policy-and-compliance-audit/search-unifi...

 

Amit_PowerBIPro, Member

If you are a Power BI Pro user looking for support from the Microsoft Power BI support team, raise a ticket as follows.

  1. Go to http://support.powerbi.com
  2. Scroll towards the bottom of the page. Under "Still having issues", click CREATE SUPPORT TICKET.

         Note: Pro users get free support.

  3. Once you click it, you will be redirected to https://powerbi.microsoft.com/en-us/support/pro/
  4. You can search for the issue. If the issue is already listed, it will be displayed, and you will also be given the option to SUBMIT A TICKET.
  5. Click CONTINUE TO SUPPORT TICKET. You will be redirected to https://powerbi.microsoft.com/en-us/support/pro/ticket/

Once you submit the ticket, you will get an email from support@mail.support.microsoft.com, and someone from the Microsoft support team will contact you.

jfeil, Member

Ashleigh McGovern develops strategy, raises money, and crafts communications for the Center for Oceans at Conservation International (CI). Among other things, she’s responsible for driving innovative solutions with CI's global teams, partners, and donors to improve ocean health.

Read more...

jfeil, Member

Time to get excited – the full session catalog for Microsoft Business Applications Summit is here. Explore every breakout session and workshop coming to the conference, taking place in Atlanta, Georgia June 10 – 11, 2019. Get ready to flex your skills – and build new ones – with all things Power BI and beyond.

Read more...

Amit_PowerBIPro, Member

As per the April 2019 update, Microsoft has introduced a data profiling capability in Power BI Desktop.
Data profiling helps us easily find issues with the data we import into Power BI from our data sources.

Read more...

catdor, Frequent Visitor

Recently we announced the capability to create machine learning models directly in Power BI. This is powered by automated machine learning from Azure.

 

You can create an ML model based on your data in a few simple steps using this new capability, so this post aims to shed some light on the next steps – model performance and application – and to provide a few tips about the reports.

 

Read more...

ancyphilip, Visitor

I love using Power BI for analysis! Recently Power BI added support for AutoML (supervised automated machine learning), and I wanted to demonstrate the new ML capabilities with an end-to-end example. In this article, I use AutoML to create and apply a regression model. Regression is a machine learning technique used for understanding the relationship between input and output variables (the input is what is known in your data; the output is generally an effect of the input data and is sometimes unknown). Regression gives us more insight into data by making it easier to see the relation between dependent and independent variables. These are the five steps I followed in this article to create the regression model:

(image: the five steps, which correspond to the sections below)

Refer to this tutorial for a detailed walkthrough of each of these steps, and this tutorial for the ones specific to regression models.

 

Creating a dataflow with the input data

I created a dataflow using the House Sales prediction dataset from Kaggle. For the purposes of this article, it is also available at the following link: house sales prediction. It contains the sale prices, number of bedrooms, square footage, number of floors, latitude, longitude, etc. of houses in King County (which includes Seattle) sold between May 2014 and May 2015. The Price attribute, indicating the price of the house, is the numeric field used as the label for the regression model. The dataset was split into two entities, HousePrices2014 and HousePrices2015, based on the year attribute.


Note that though most of the attribute data types in this dataset are numeric, they may be interpreted as strings by Power Query. So most of these fields, including the label, were converted to numeric.

Training a machine learning model

I chose HousePrices2014 as the entity on which to apply ML models. Power BI analyzed the Price field and suggested Regression as the type of machine learning model to create for predicting that field. It took an impressive 7 minutes or so to train 12 algorithms on 14 thousand data points.

 

Model validation report

After the training refresh completed, I checked out the training report. Key Influencers shows that Latitude, Sqft_living, and Grade are the top predictors for the model.

 

(screenshot: the Key Influencers view)

 

To verify this, I plotted sqft_living (square footage of the home) against price. The plot does indeed show a relationship, indicating that more living space corresponds to higher prices. As shown in the sqft_living breakdown above, houses of 2.7k to 2.9k square feet have the highest share of range-1-priced houses (72k to 1M).

 

(scatterplot: sqft_living vs. price)

 

Likewise, a scatterplot between grade (the overall grade given to the housing unit, based on the King County grading system) and price shows that houses with higher grades do have higher prices.

 

(scatterplot: grade vs. price)

 

The model is 88% performant, which is a pretty great value for this dataset. As you can see in the chart below, most of the test samples lie on the diagonal, showing the low prediction error.

(charts: predicted vs. actual values and model performance)

 

The Training Details page shows the performance of the 25 different models, how features were extracted from the inputs, and the hyperparameters of the final model used. AutoML chose an ensemble, a Pre-fitted Soft Voting Regressor, as the best model for this dataset.

 


 

Applying the model

As I was satisfied with the training results, I proceeded to apply the model to the HousePrices2015 entity. A HousePrices2015 enriched RegressionModel entity was created, which includes the predicted output from the model. Overall, it took only around 2 minutes to score nearly 7 thousand rows!

 

Applying a regression model added two columns: the regression outcome and the top record-specific influencers for each prediction. The HousePrices2015 enriched Regression entity, with explanations and predicted prices, is shown below alongside the actual prices in Power BI Desktop. The local explanations contain a breakdown of how much each feature moved the specific instance away from the average price of houses. For categorical features, they show how much the price would have changed for different values of the categories. In the highlighted example, you can see that the house price was predicted to be $554,174. The explanation says that "condition" caused the price to increase by 44k from the average price of houses, and that "latitude" caused the price to drop by 26k.

 

(screenshot: the enriched entity with predicted prices and explanations)

 

I established a connection to the dataflow from Power BI Desktop to incorporate the model's predictions into my reports and visualize the results. I plotted a heat map of predicted prices using latitude and longitude, with prices represented by a gradient running from red (100% gradient stop) through green (50% gradient stop) to yellow (0% gradient stop).

 

(screenshot: heat map of predicted prices)

We can see that, for example, Downtown Seattle and Downtown Bellevue have red regions as expected, owing to high prices.

 


 

I tried a scikit-learn Random Forest on this same dataset. It took 1 minute, using 3-fold cross-validation and searching across 30 different combinations on all available cores, to give an R2 score of 0.85.

 


Summary

Even though AutoML took longer, I am impressed to have gotten a better R2 score of 0.88 by trying not just Random Forest but 25 different models, in addition to getting instance-level explanations. Moreover, AutoML automatically does preprocessing, such as dropping features like ID that carry no useful information and generating additional features from datetime columns (month, year, etc.). In scikit-learn, on the other hand, one must encode string or datetime features or drop them.

Overall, I’m very excited about AutoML, because I did not have to learn the intricacies of the ML models nor worry about the implementation, and I still got the same results. I have done this in Python and R, and it took me considerably longer to code up and set up the hyperparameters, whereas in AutoML I am just hitting a bunch of buttons and voilà, I am done! Given how much easier it is to get started with less effort, I encourage business analysts and even data scientists/analysts to try out AutoML!
