
smo
Frequent Visitor

QlikSense to Power BI pick(wildMatch)

Hello 

I am trying to move from QlikSense to Power BI. I am looking for an alternative to pick(wildmatch(4010, '40??', '50??', '30*'), 1, 2, 3) in QlikSense.

With this formula I can do wildcard mapping. For example, pick(wildmatch(4010, '40??', '50??', '30*'), 1, 2, 3) = 1: 4010 matches the first pattern, '40??', so wildmatch returns 1 and pick selects the first element of the value list.

Is there anything in Power BI that will work like this?

In VBA I was able to do this the other way round, looping over the patterns:

For Each Member In PatternArray
    If "4010" Like Member Then
        Result = Member   ' matched: take the value mapped to this pattern
        Exit For
    End If
Next Member

 

I would appreciate it if you could help me.

Regards

 

 

 

v-lili6-msft
Community Support

hi, @smo 

You could use the SEARCH function for pick, and nested IF or the SWITCH function for wildmatch, to build a formula that follows your logic.

For example:

New Column 2 =
SEARCH(
    SWITCH(
        TRUE(),
        SEARCH("40", [Column], 1, 0) > 0, 1,
        SEARCH("50", [Column], 1, 0) > 0, 2,
        SEARCH("30", [Column], 1, 0) > 0, 3
    ),
    "123", 1, 0
)

Result:

(screenshot: 2.JPG)

And here is a sample pbix file; please try it.

 

Regards,

Lin

Community Support Team _ Lin
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Hello, thanks for the response. I managed to get things working with the following DAX formula:

PB_Account_F9991 =
VAR Matches =
    TOPN(
        1,
        CALCULATETABLE(
            GENERATE(
                'SUN_YTD',
                FILTER(
                    ACCOUNT_F9991,
                    SEARCH(
                        [ACCOUNT_F9991_EXTERNAL],
                        SUN_YTD[KeyAccount],
                        1,
                        0
                    ) > 0
                )
            )
        )
    )
RETURN
    CONCATENATEX(
        Matches,
        [ACCOUNT_F9991_INTERNAL],
        ","
    )

 

But I would like to move this to M, as the function takes too much time in DAX and I need to reference another calculated field from it.

Is there any way to move this to Power Query?
Is there any way to use the resulting query, including the calculated columns from another Power BI import, so that it can be modified with Power Query?
Regards
Sofia

hi, @smo 

It is difficult to do this in Power Query, but we may be able to optimize the formula. Could you share some sample data and your expected output so we can test?

 

Regards,

Lin

Community Support Team _ Lin
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Hello   

 

I am trying to do this with R at the moment.
I have a mapping table with 2 columns. The real list of patterns is long; here are 4 examples:

WildCardColumn               Mapping
*_I*_???_007870000000100     4111
*_???_007870000000100        4112
*_I?*_???_006330000000005    5111
*_I?*_???_0063300000?????    5116
  

 

My other table holds the data:

 

Entity   Account            Counterparty   Amount
B22      007870000000100    IR22           2000
B22      007870000000100    T22            4000
B22      006330000000005    R22            4500
B22      006330000000007    IR22           5000

 

The result I want is:

 

Entity   Account            Counterparty   Amount   Result
B22      007870000000100    IR22           2000     4111
B22      007870000000100    T22            4000     4112
B22      006330000000005    R22            4500     5111
B22      006330000000007    IR22           5000     5116

 

 

I am thinking of using R
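The core behaviour I need, returning the value mapped to the first wildcard pattern that matches, could be sketched in R roughly like this (the function name is only for illustration):

# sketch of pick(wildmatch(...)): return the value mapped to the first
# glob pattern that matches ('pick_wildmatch' is just an illustrative name)
pick_wildmatch <- function(value, patterns, results) {
  hit <- which(sapply(patterns, function(p) grepl(glob2rx(p), value)))[1]
  if (is.na(hit)) NA else results[hit]
}
pick_wildmatch("4010", c("40??", "50??", "30*"), c(1, 2, 3))  # returns 1, as in QlikSense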

 

To convert the wildcards to regular expressions I am using the formula below:

 

outputRegEx <- within(dataset, {RegEx_WildCardColumn = glob2rx(dataset$WildCardColumn)})
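For example, assuming the first pattern column above holds *_I*_???_007870000000100, glob2rx produces something like:

# '*' becomes '.*' and '?' becomes '.', anchored with '^' and '$'
glob2rx("*_I*_???_007870000000100")
# [1] "^.*_I.*_..._007870000000100$"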

 

I am able to add a column with the pattern in the following way:

library("stringr")
# after glob2rx, '?' in the glob has become '.' in the regular expression
pattern <- "^.*_..._0062102040.....$"
cleanRegEx <- str_extract(dataset$KeyAccount, pattern)
outputRegEx <- within(dataset, {RegEx = cleanRegEx})

My idea is to loop through the wildcard table, take each regular expression, filter the data table with it, and build a new data frame with the new column.

Could you guide me on this?

 

Regards

smo
Frequent Visitor

Hello

I do not know if this is the best solution from a performance point of view, but I solved the issue using the R code below.
First, on the mapping table, I apply the following R function to transform the wildcards into regular expressions:

# 'dataset' holds the input data for this script
outputRegEx <- within(dataset, {RegEx_ACCOUNT_F9991_EXTERNAL = glob2rx(dataset$ACCOUNT_F9991_EXTERNAL)})

 


Then, on the table with the data, I use the following code. For each row of the mapping table it takes the regular expression, filters the data table with it, and adds the mapping code as a second column to the matching rows. The matched rows are then removed from the data frame so the same row is not mapped twice, and at the end the remaining unmatched rows are appended.

Hope this helps other people with the same issue.

# 'dataset' holds the data table, 'dataset2' the mapping table (regex, code)
library("stringr")

# empty frame that will collect the mapped rows
subData <- data.frame(
  ENTITY = character(),
  KeyAccount = character(),
  BCP_ACCOUNT = character())

for (i in 1:nrow(dataset2)) {
  patternLook <- dataset2[i, 1]    # regular expression for this mapping row
  patternResult <- dataset2[i, 2]  # internal code to assign on a match

  isValid <- function(x) { grepl(patternLook, as.character(x), ignore.case = TRUE) }

  # keep the matching rows, with the code added as BCP_ACCOUNT
  subData <- rbind(subData,
                   subset(within(dataset, {BCP_ACCOUNT = toString(patternResult)}),
                          isValid(dataset$KeyAccount)))

  # drop the matched rows so a later pattern cannot map them again
  dataset <- dataset[!(dataset$KeyAccount %in%
                         subset(dataset, isValid(dataset$KeyAccount))$KeyAccount), ]
}

# append whatever is left with no matching pattern
subData <- rbind(subData, within(dataset, {BCP_ACCOUNT = "NA"}))

 

 

It is important to bring the second data frame into the same Run R script step as the first one; just reference both tables there, e.g. [dataset=#"Added Custom", dataset2=ACCOUNT_F9991].

Make sure both tables are used in the same Run R script step.

 

 

amitchandak
Super User
