Anonymous
Not applicable

Optimize R script in Power BI

I have an R script that pulls from the Cisco Prime API. When run in the R console it takes 5 minutes (still not ideal) and pulls over 3,000 records, but when I put it in Power BI, it takes nearly 10 minutes to get even 100 records, and it errors out when I try to run the whole script. The script is below; any suggestions would be greatly appreciated. I was thinking about multithreading, but even then I feel like I'd still be way too far off. Let it be noted that this is my first time using R in any sort of complex fashion, so any other code suggestions are welcome as well. Thank you!

Note: In Microsoft R it takes around 20-25 seconds per 500 results pulled.

require("httr")
require("jsonlite")

require("foreach")

username <- "account"
password <- "Password"


#Get total device count
main_url <- "https://server/webacs/api/v3/data/InventoryDetails?.firstResult=0&.maxResults=500&.full=true"
get_genstats <- GET(main_url, authenticate(username,password, type = "basic"))
get_genstats_text <- content(get_genstats, "text")
get_genstats_json <- fromJSON(get_genstats_text, flatten = TRUE)
device_count <- get_genstats_json$queryResponse$'@count'

#Initialize Master dataframe
master_df <- as.data.frame(get_genstats_json)
#set i to 500 since the master dataframe pulled the first 500 rows
i <- 500
#Must page due to API limitations
while (i <= device_count - 1)
{
paging_url <- paste0("https://server/webacs/api/v3/data/InventoryDetails?.firstResult=",i,"&.maxResults=500&.full=true")

get_devices <- GET(paging_url, authenticate(username,password, type = "basic"))
get_devices_text <- content(get_devices, "text")
get_devices_json <- fromJSON(get_devices_text, flatten = TRUE)
i <- i + 500
get_devices_df <- as.data.frame(get_devices_json)
master_df <- rbind(master_df, get_devices_df)
}

#Renames column names to be more readable
foreach(i=1:ncol(master_df))%do%
{
cur_col_name <- colnames(master_df)[i]

#Get Char positions of dots
char_pos <- rev(gregexpr("\\.", cur_col_name)[[1]])

#get last dot position
char_pos <- char_pos[1]

#Avoide Duplicate Column Names
if(toString(substr(cur_col_name,char_pos+1, nchar(cur_col_name)))%in%colnames(master_df)== FALSE){
colnames(master_df)[i] <- substr(cur_col_name,char_pos+1, nchar(cur_col_name))
}

else{
colnames(master_df)[i] <- paste(substr(cur_col_name,char_pos+1, nchar(cur_col_name)),"2")
}

#Convert type from list to character string for Power BI processing
if(sapply(master_df,class)[i] == "list"){
master_df[,i] <- as.character(master_df[,i])
}
}
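
Two hedged sketches of where the runtime likely goes, based only on the script above. First, rbind inside the while loop copies master_df on every pass, so the cost grows roughly quadratically with the number of pages; the sketch below collects each page in a list and binds once at the end. It reuses the server URL, credentials, and 500-row page size from the script; fetch_page is a made-up helper for illustration, not part of the Prime API.

library(httr)
library(jsonlite)

# Hypothetical helper wrapping one paged request
fetch_page <- function(first_result) {
  url <- paste0("https://server/webacs/api/v3/data/InventoryDetails?",
                ".firstResult=", first_result, "&.maxResults=500&.full=true")
  resp <- GET(url, authenticate(username, password, type = "basic"))
  fromJSON(content(resp, "text"), flatten = TRUE)
}

first_page <- fetch_page(0)
device_count <- as.numeric(first_page$queryResponse$"@count")

# Offsets of the remaining pages (empty if everything fit in the first page)
offsets <- if (device_count > 500) seq(500, device_count - 1, by = 500) else integer(0)

# Fetch every remaining page into a list, then bind once at the end
pages <- c(list(as.data.frame(first_page)),
           lapply(offsets, function(off) as.data.frame(fetch_page(off))))
master_df <- do.call(rbind, pages)

Since the HTTP calls are independent, that lapply is also the natural place to parallelize if the requests dominate the runtime, e.g. with parallel::mclapply, or with %dopar% from the foreach package the script already loads (given a registered backend such as doParallel).

Second, the renaming loop rescans the column names one at a time; the same "keep everything after the last dot, then de-duplicate" rule can be done in one vectorized line (make.unique numbers duplicates slightly differently than the manual "2" suffix):

# Strip everything up to the last dot, then de-duplicate the results
colnames(master_df) <- make.unique(sub(".*\\.", "", colnames(master_df)), sep = "")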

2 REPLIES
v-lili6-msft
Community Support

Hi @Anonymous,

You could try the following:

1. Delete the code under "#Renames column names to be more readable" and do those steps in Power BI (Power Query Editor) instead.

2. If the data source is too large, you could try another way to transfer the data, e.g. a SQL database.

Best Regards,

Lin


Community Support Team _ Lin
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Anonymous
Not applicable

Thank you for your post. I tried cutting that part out and it still doesn't load. In R that reduces the runtime by about 2 minutes, but in the end I still need to do something to change my lists to chars in order to load it into Power BI. Unfortunately, Cisco doesn't allow direct database access and you are forced to use an API, which is pretty lame.
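
For what it's worth, if the list-to-character conversion is the one step that has to stay in R, a minimal sketch of just that piece (assuming master_df is the data frame built in the original script) is:

# Flatten any list columns to comma-separated strings so Power BI can load them;
# non-list columns pass through unchanged
master_df[] <- lapply(master_df, function(col) {
  if (is.list(col)) vapply(col, toString, character(1)) else col
})

This also avoids recomputing sapply(master_df, class) on every pass of the renaming loop, which rescans the whole data frame each time.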
