jader3rd (Regular Visitor)

Power BI Service fails to refresh Azure Data Lake Store data source with "cannot convert the value null to type Logical"

I have an Azure Data Lake Store data source. I can refresh it from Power BI Desktop. After I publish, refreshing the data source from the Service fails with "We cannot convert the value null to type Logical."

Other threads indicate that the problem is the DataLake.Contents() method taking [] as its second parameter, and that it should be null instead. But Power BI Desktop didn't create the M query with [] as the second parameter; it has [PageSize=null].

So it looks like 

Source = DataLake.Contents("adl://<Path to Folder>/", [PageSize=null]),
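
For comparison, here is a minimal sketch of the three variants being discussed; the path is the same placeholder as above, and the step names are just for illustration:

let
    // What other threads say fails when refreshed from the Service (empty options record)
    SourceWithEmptyOptions = DataLake.Contents("adl://<Path to Folder>/", []),
    // What those threads suggest passing instead
    SourceWithNullOptions = DataLake.Contents("adl://<Path to Folder>/", null),
    // What Power BI Desktop actually generated for me
    SourceFromDesktop = DataLake.Contents("adl://<Path to Folder>/", [PageSize=null])
in
    SourceFromDesktop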

1 ACCEPTED SOLUTION
jader3rd (Regular Visitor)

Re: Power BI Service fails to refresh Azure Data Lake Store data source with "cannot convert the value null to type Logical"

There were two things that needed to happen to get this to work.

First, there needed to be a service account without two-factor auth to use as the credentials for the dataset.

Second, I had to change the M query from what Power BI Desktop created:

let
    Source = DataLake.Contents("adl://<full path>", [PageSize=null]),
    #"Filtered Hidden Files1" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    #"Invoke Custom Function1" = Table.AddColumn(#"Filtered Hidden Files1", "Transform File from Query1 (2)", each #"Transform File from Query1 (2)"([Content])),
    #"Removed Other Columns1" = Table.SelectColumns(#"Invoke Custom Function1", {"Transform File from Query1 (2)"}),
    #"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from Query1 (2)", Table.ColumnNames(#"Transform File from Query1 (2)"(#"Sample File (2)"))),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Table Column1",{{"Column1", type text}, {"Column2", type text}, {"Column3", type datetime}, {"Column4", type number}, {"Column5", type number}, {"Column6", type datetime}, {"Column7", Int64.Type}})
in
    #"Changed Type"

to

let
    Source = DataLake.Contents("adl://<path to folder>"),
    #"File1" = Source{[Name="<file>.tab"]}[Content],
    #"Imported CSV" = Csv.Document(File1,[Delimiter="#(tab)", Encoding=1252]),
    #"Changed Type" = Table.TransformColumnTypes(#"Imported CSV",{{"Column3", type datetime}, {"Column4", type number}, {"Column5", type number}, {"Column6", type datetime}, {"Column7", Int64.Type}})
in
    #"Changed Type"
3 REPLIES
Community Support Team

Re: Power BI Service fails to refresh Azure Data Lake Store data source with "cannot convert the value null to type Logical"

Hi @jader3rd,

 

The error message shows that it can't convert a null value.

 

Please check if you have null values in your data. Perhaps you can try replacing these null values with a default value before your operation.
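
For example, here is a small self-contained sketch of that idea, using Table.ReplaceValue to swap nulls for a default value before the type change (the sample table and column names are placeholders, not taken from your query):

let
    // A tiny sample table containing nulls, just to show the idea
    Source = #table({"Column4", "Column5"}, {{1.5, null}, {null, 2}}),
    // Replace null with a default value (0 here) before converting types
    #"Replaced Nulls" = Table.ReplaceValue(Source, null, 0, Replacer.ReplaceValue, {"Column4", "Column5"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Replaced Nulls", {{"Column4", type number}, {"Column5", type number}})
in
    #"Changed Type"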

 

In addition, could you refresh the report successfully in Power BI Desktop?

 

Best Regards,

Cherry

Community Support Team _ Cherry Gao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
jader3rd (Regular Visitor)

Re: Power BI Service fails to refresh Azure Data Lake Store data source with "cannot convert the value null to type Logical"

There are no null values. I'm currently experimenting with a file that only has two rows.

Yes, refreshing from the Desktop works.
