Hi,
I'm using Power BI Desktop to connect to web services using the From Web query. Everything works well until I try to merge queries. The web service I connect to only allows one call per two seconds, so I think I'm hitting that limit. Is there a way to introduce a delay to get around this?
Ludvig
I'm facing the same problem: I need to add a delay when making a request to my API.
let
    Fonte = Json.Document(Web.Contents("https://api.4yousee.com.br/v1/playlists/45", [Headers=[#"Secret-Token"="#(tab)xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"]])),
    items = Fonte[items],
    #"Convertido para Tabela" = Table.FromList(items, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Column1 Expandido" = Table.ExpandRecordColumn(#"Convertido para Tabela", "Column1", {"type", "id", "name", "items", "sequence"}, {"Column1.type", "Column1.id", "Column1.name", "Column1.items", "Column1.sequence"}),
    #"Column1.items Expandido" = Table.ExpandListColumn(#"Column1 Expandido", "Column1.items"),
    #"Column1.items Expandido1" = Table.ExpandRecordColumn(#"Column1.items Expandido", "Column1.items", {"type", "id", "name", "file", "durationInSeconds", "categories", "contentSchedule"}, {"Column1.items.type", "Column1.items.id", "Column1.items.name", "Column1.items.file", "Column1.items.durationInSeconds", "Column1.items.categories", "Column1.items.contentSchedule"}),
    #"Column1.items.contentSchedule Expandido" = Table.ExpandRecordColumn(#"Column1.items Expandido1", "Column1.items.contentSchedule", {"endDate"}, {"Column1.items.contentSchedule.endDate"}),
    #"Colunas Removidas" = Table.RemoveColumns(#"Column1.items.contentSchedule Expandido",{"Column1.type", "Column1.id", "Column1.items.type", "Column1.items.id", "Column1.items.file", "Column1.items.durationInSeconds", "Column1.items.categories", "Column1.sequence"}),
    #"Valor Substituído" = Table.ReplaceValue(#"Colunas Removidas",null,"11/05/2045",Replacer.ReplaceValue,{"Column1.items.contentSchedule.endDate"}),
    #"Tipo Alterado" = Table.TransformColumnTypes(#"Valor Substituído",{{"Column1.items.contentSchedule.endDate", type date}}),
    #"Linhas Agrupadas" = Table.Group(#"Tipo Alterado", {"Column1.name"}, {{"Contagem", each List.Max([Column1.items.contentSchedule.endDate]), type nullable date}}),
    #"Colunas Renomeadas" = Table.RenameColumns(#"Linhas Agrupadas",{{"Column1.name", "IANDE"}, {"Contagem", "DATA"}})
in
    #"Colunas Renomeadas"
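A sketch of how the Function.InvokeAfter technique discussed in this thread could be applied here: wrap the request in a zero-argument function and delay its invocation. The two-second delay is an assumption (adjust it to your API's rate limit), and the token is a placeholder copied from the post above.

let
    // Wrap the request in a function so Function.InvokeAfter can delay it
    GetPlaylist = () => Json.Document(
        Web.Contents(
            "https://api.4yousee.com.br/v1/playlists/45",
            [Headers = [#"Secret-Token" = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"]]
        )
    ),
    // Wait two seconds, then make the call; adjust the duration as needed
    Fonte = Function.InvokeAfter(GetPlaylist, #duration(0, 0, 0, 2)),
    items = Fonte[items]
in
    items

The rest of the expand/group steps would then hang off the items step exactly as in the query above.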
I have this function:
(Query as text, ResultType as text) =>
if ResultType = "record" then
    let
        source = OData.Feed("https://api.security.microsoft.com/api/" & Query, null, [Implementation="2.0", MoreColumns=true]),
        #"toList" = {source},
        #"toRecord" = Record.FromList(#"toList", {"Record"})
    in
        #"toRecord"
else
    let
        source = OData.Feed("https://api.security.microsoft.com/api/" & Query, null, [Implementation="2.0", MoreColumns=true]),
        #"toTable" = #table(1, {{source}})
    in
        #"toTable"
I call this function:
let
    ODataSource = customODataQuery("Machines","table"),
    Column1 = ODataSource{0}[Column1],
    #"Added Custom" = Table.AddColumn(Column1, "OnboardedDate", each [firstSeen]),
    #"Changed Type" = Table.TransformColumnTypes(#"Added Custom",{{"OnboardedDate", type date}}),
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"GetMissingKbs", "LatestMachineActions", "AvailableMachineActions"}),
    #"Expanded vulnerabilities" = Table.ExpandTableColumn(#"Removed Columns", "vulnerabilities", {"id", "name", "description", "severity", "cvssV3", "exposedMachines", "publishedOn", "updatedOn", "publicExploit", "exploitVerified", "exploitInKit", "exploitTypes", "exploitUris", "More Columns"}, {"id.1", "name", "description", "severity", "cvssV3", "exposedMachines", "publishedOn", "updatedOn", "publicExploit", "exploitVerified", "exploitInKit", "exploitTypes", "exploitUris", "More Columns"})
in
    #"Expanded vulnerabilities"
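One way the Function.InvokeAfter approach from this thread could be applied to a function like this is to wrap the OData.Feed call so each invocation is delayed. This is only a sketch; the two-second delay is an assumption and should be tuned to the API's documented rate limit:

// Sketch: delay each OData call before it runs.
// URL and options are copied from the function above;
// the delay length is an assumption.
(Query as text, ResultType as text) =>
let
    // Wrap the call in a zero-argument function so InvokeAfter can delay it
    GetFeed = () => OData.Feed(
        "https://api.security.microsoft.com/api/" & Query,
        null,
        [Implementation = "2.0", MoreColumns = true]
    ),
    // Wait two seconds, then invoke the wrapped call
    source = Function.InvokeAfter(GetFeed, #duration(0, 0, 0, 2)),
    result = if ResultType = "record"
        then Record.FromList({source}, {"Record"})
        else #table(1, {{source}})
in
    result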
Problem
When I click Apply to save the changes, I run into the API call limit. Can someone help, please?
@lsuneson I'm no expert in API calls, but this might help you; it also has a reference to an old, similar post:
https://blog.crossjoin.co.uk/2015/04/30/using-function-invokeafter-in-power-query/
Thanks @konstantinos. I have actually seen that post, but I couldn't figure out how to use Function.InvokeAfter().
Below is a simple example of my query; if anyone can give some advice, it would be much appreciated.
let
    Source = Json.Document(Web.Contents("https://api.skulabs.com/purchase_order/get?limit=100&skip=&sort=")),
    #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"number", "distributor_id"}, {"Column1.number", "Column1.distributor_id"}),
    #"Merged Queries" = Table.NestedJoin(#"Expanded Column1",{"Column1.distributor_id"},MapTable, {"Column1._id"},"NewColumn",JoinKind.LeftOuter)
in
    #"Merged Queries"
Try something like this:
let
    Source = () => Json.Document(Web.Contents("https://api.skulabs.com/purchase_order/get?limit=100&skip=&sort=")),
    #"Converted to Table" = Table.FromList(Function.InvokeAfter(Source, #duration(0,0,0,2)), Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"number", "distributor_id"}, {"Column1.number", "Column1.distributor_id"}),
    #"Merged Queries" = Table.NestedJoin(#"Expanded Column1",{"Column1.distributor_id"},MapTable, {"Column1._id"},"NewColumn",JoinKind.LeftOuter)
in
    #"Merged Queries"
This adds a two-second delay before calling the web service; the length of the delay is determined by #duration(0,0,0,2) in the #"Converted to Table" step.
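For reference, #duration takes days, hours, minutes and seconds in that order, and Function.InvokeAfter expects a zero-argument function plus a duration. A minimal, self-contained sketch (the durations shown are arbitrary examples):

let
    // #duration(days, hours, minutes, seconds)
    TwoSeconds = #duration(0, 0, 0, 2),
    FiveMinutes = #duration(0, 0, 5, 0),
    // Evaluate a zero-argument function after waiting two seconds
    Delayed = Function.InvokeAfter(() => "done", TwoSeconds)
in
    Delayed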
HTH,
Chris
Hi @cwebb, thanks a lot for your help, it's very appreciated.
I think Source is of type list and Function.InvokeAfter needs a function as a parameter, because I'm getting the following error:
Expression.Error: We cannot convert a value of type List to type Function.
Details:
Value=List
Type=Type
Any advice on how to fix this?
I also made a change to the Source step in my example that means it should return a function, so maybe something else is going on here. Can you post the code that gives you this error?
Thanks,
Chris
Sorry, I missed that. It works well now, thank you so much!
Is there a way to trace the calls being made? I seem to have a loop over the HTTP GET call, as this is now very slow.
Are merge or any other functions causing loops/iterations over the calls?
There's no easy way of tracing this in Power BI; you could use a tool like Fiddler though.
It is possible that multiple calls are being made. It might be worth experimenting with Table.Buffer() here (see https://blog.crossjoin.co.uk/2015/05/05/improving-power-query-calculation-performance-with-list-buff... for some background). Maybe something like this would work:
let
    Source = () => Json.Document(Web.Contents("https://api.skulabs.com/purchase_order/get?limit=100&skip=&sort=")),
    #"Converted to Table" = Table.FromList(Function.InvokeAfter(Source, #duration(0,0,0,2)), Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.Buffer(Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"number", "distributor_id"}, {"Column1.number", "Column1.distributor_id"})),
    #"Merged Queries" = Table.NestedJoin(#"Expanded Column1",{"Column1.distributor_id"},MapTable, {"Column1._id"},"NewColumn",JoinKind.LeftOuter)
in
    #"Merged Queries"
Chris
Thanks, I downloaded Fiddler and it looks like Power BI makes a new HTTP call for more or less every operation. I've only taken a couple of programming classes, ten years ago, but that wouldn't even be allowed in 101 classes!
I have tried Table.Buffer, but it didn't help much in my more complex queries. I guess I'll try to keep the queries to raw data, do all calculations etc. using the modelling functionality, and see if that helps.
Thanks again Chris!