Hi all,
I've built a simple scraper that pulls pricing data instead of copy/pasting it manually. Currently each query is pointed to a product URL by changing the "Source" of the query. Is there a way I can create a table of URLs and have one query look at each of them and extract the source code data, instead of having separate queries for each product page and appending them?
It would save me considerable copy/paste time if I can achieve this.
Thanks
Hi @Anonymous
I assume you have something like this.
let
    Source = Web.BrowserContents("https://abc.com/?page=12"),
    // Your additional transformation steps go below
    ...
    ...
    #"LastStep" = ...
in
    #"LastStep"
Then you have other URLs to which you need to apply the same steps. In that case, turn the query into a function as below.
(url as text) =>
let
    Source = Web.BrowserContents(url), // Replace the hardcoded URL with the url parameter
    // Your additional transformation steps go below
    ...
    ...
    #"LastStep" = ...
in
    #"LastStep"
That's all you need, I hope.
Show your appreciation with kudos by clicking the like button at the bottom right.
Please mark as a solution if this solves your problem.
Thanks
Hi @Anonymous ,
as @Anonymous said.
But if you want to regularly refresh the results in the Power BI service, you have to move the dynamic URLs into the query parameters instead. Otherwise you'll get an error complaining about dynamic data sources: https://www.thebiccountant.com/2018/03/22/web-scraping-2-scrape-multiple-pages-power-bi-power-query/
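As a rough illustration of that workaround (not from the linked post, which has the full details), the usual pattern keeps the base URL static and puts the changing parts into the Query option of Web.Contents. Note that Web.BrowserContents used above may not accept these options, so this sketch uses Web.Contents; the URL and page parameter are placeholders:
let
    // Static base URL so the service can validate the data source; only the query string varies
    Raw = Web.Contents("https://abc.com/", [Query = [page = "12"]]),
    // Web.Contents returns binary, so convert it to text before parsing the HTML
    Html = Text.FromBinary(Raw)
in
    Html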
Imke Feldmann (The BIccountant)
If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!
How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI learning resources here -- Performance Tipps for M-queries
Hi @Anonymous, you can create a Power Query function, build a table with the URLs, and iterate over those URLs to get the data.
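A quick sketch of that iteration, assuming the function from the earlier reply is saved as GetPageData and returns a table (the function name and URLs are placeholders):
let
    // Placeholder list of product page URLs; this could also come from an Excel or entered-data table
    Urls = {"https://abc.com/?page=12", "https://abc.com/?page=13"},
    // Call the function for each URL, producing a list of tables
    Pages = List.Transform(Urls, each GetPageData(_)),
    // Stack the results into a single table
    Combined = Table.Combine(Pages)
in
    Combined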
Thanks.