Syndicate_Admin
Administrator

Find whether certain values occur before other columns

Hello again!

I have this dataset with a series of interventions and a date for each intervention:

INDX | INT 1 | DATE INT 1 | INT 2 | DATE INT 2 | INT 3 | DATE INT 3 | INT 4 | DATE INT 4 | INT 5 | DATE INT 5 | INT 6 | DATE INT 6
1 | R | 06/01/2022 | T | 08/04/2022 | U | 04/08/2022 | O | 01/09/2022 | P | 06/04/2022 | Xx | 05/01/2022
2 | Q | 17/01/2022 | T | 06/01/2022 | B | 01/01/2022
3 | Ll | 05/06/2022 | Xx | 02/01/2022 | Jj | 04/01/2022 | Yy | 02/01/2022
4 | Xx | 04/01/2022 | U S | 04/01/2022 | Q | 05/01/2022 | Tt | 10/01/2022
5 | Hh | 04/01/2022 | Ss | 09/04/2022 | AA | 17/01/2022 | Vv | 05/09/2022
6 | Pp | 09/01/2022 | Ll | 06/01/2022 | Yy | 03/01/2022 | Hh | 04/05/2022 | Xx | 04/01/2022 | Xx | 07/01/2022
7 | Ss | 03/01/2022 | Oo | 04/01/2022 | Xx | 05/01/2022
8 | Yy | 02/01/2022 | Xx | 01/01/2022 | AA | 03/01/2022
9 | Yy | 05/01/2022 | Jj | 01/01/2022 | Yy | 06/01/2022 | Ll | 04/09/2022

I want to get a 1 if XX or YY occurs before any other intervention. So, for example, the first row should return a 1 and row 7 should return nothing.

If XX and YY occur together before anything else (as in row 3), getting a 1 is fine; but if XX or YY occurs on the first day together with something else (as in row 4), I would rather get nothing.

Thanks in advance for the help!

1 REPLY
Syndicate_Admin
Administrator

That source data is in pretty bad shape. The first step is to unpivot it into something usable:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jZLLCoMwEEV/pWQt5GESzdIuRERQSxVF/P/faKbGIQ8tXQTiwZzJ3Mm2EU4y8rKLaco4FUwI+/EGUFImTzABkNQyB3oAnDJzgsE58MiyAFFo3bONAB/t4kVczK/+dG4Ej8sFwtzuus4V0mFp4Rna1jWAZF3jf1K7RJd/cppiMoad2pagJ85+ypXdNU2sqmsgxsuxquLA5tnVM/d2DSMZnAtPHlHpJIbcI3gnFcbp3/IgRTDbAm/v2/r++qz6a7zl5aAOg/9AvhkFde+dBp0qeSA8SUYn6cn73PcP", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [INDX = _t, #"INT 1" = _t, #"DATE INT 1" = _t, #"INT 2" = _t, #"DATE INT 2" = _t, #"INT 3" = _t, #"DATE INT 3" = _t, #"INT 4" = _t, #"DATE INT 4" = _t, #"INT 5" = _t, #"DATE INT 5" = _t, #"INT 6" = _t, #"DATE INT 6" = _t]),
    // number of INT/DATE column pairs: every column except INDX, two per intervention
    CT = (List.Count(Table.ColumnNames(Source))-1)/2,
    // fold over the pairs, merging each "INT n" with its "DATE INT n" into a single comma-delimited "Merged n" column
    #"Merged Columns1" = List.Accumulate({1..CT},Source,(state,current)=>Table.CombineColumns(state,{"INT " & Text.From(current), "DATE INT " & Text.From(current)},Combiner.CombineTextByDelimiter(",", QuoteStyle.None),"Merged " & Text.From(current))),
    // unpivot the merged columns so each intervention/date pair gets its own row
    #"Unpivoted Other Columns" = Table.UnpivotOtherColumns(#"Merged Columns1", {"INDX"}, "Attribute", "Value"),
    // drop rows that came from empty INT/DATE pairs
    #"Filtered Rows" = Table.SelectRows(#"Unpivoted Other Columns", each ([Value] <> " , ")),
    // split the merged value back into separate INT and Date columns
    #"Split Column by Delimiter" = Table.SplitColumn(#"Filtered Rows", "Value", Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv), {"INT", "Date"}),
    // parse the dates as dd/MM/yyyy via the en-GB locale
    #"Changed Type with Locale" = Table.TransformColumnTypes(#"Split Column by Delimiter", {{"Date", type date}}, "en-GB"),
    #"Changed Type" = Table.TransformColumnTypes(#"Changed Type with Locale",{{"INDX", Int64.Type}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type",{"INDX", "INT", "Date"})
in
    #"Removed Other Columns"

After that, you can create a measure that implements the desired logic.

XX or YY is first Int = 
VAR a = MIN ( 'Table'[Date] )
VAR i = CALCULATE ( MIN ( 'Table'[INT] ), 'Table'[Date] = a )
RETURN IF ( i IN { "XX", "YY" }, 1, 0 )
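The question asked for nothing (rather than 0) when the condition fails. A minimal variation, assuming the same 'Table'[INT] and 'Table'[Date] columns as above, is to omit the else branch so IF returns a blank:

XX or YY is first Int (blank) = 
VAR a = MIN ( 'Table'[Date] )
VAR i = CALCULATE ( MIN ( 'Table'[INT] ), 'Table'[Date] = a )
RETURN IF ( i IN { "XX", "YY" }, 1 )

For INDX 1 the earliest date is 05/01/2022 and the only intervention on that date is Xx, so this returns 1; for INDX 7 the earliest intervention is Ss, so it returns blank.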

[Screenshot of the result: lbendlin_0-1661119124476.png]

See attached.

Thanks for the challenge: the unpivot part was tricky, as it needed pairs of columns.
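In case the column-pairing trick is the part of interest: the idea is one Table.CombineColumns call per INT/DATE pair, driven by List.Accumulate, so that a single unpivot can follow. Here is a stripped-down sketch on a hypothetical two-pair table (the toy data and step names are made up for illustration):

let
    // hypothetical table with just two INT/DATE pairs
    Source = Table.FromRecords({
        [INDX = 1, #"INT 1" = "R", #"DATE INT 1" = "06/01/2022", #"INT 2" = "Xx", #"DATE INT 2" = "05/01/2022"]
    }),
    // number of pairs: every column except INDX, two per intervention
    CT = (List.Count(Table.ColumnNames(Source)) - 1) / 2,
    // fold over 1..CT, merging each "INT n" with its "DATE INT n" into one comma-delimited column
    Merged = List.Accumulate(
        {1..CT},
        Source,
        (state, current) => Table.CombineColumns(
            state,
            {"INT " & Text.From(current), "DATE INT " & Text.From(current)},
            Combiner.CombineTextByDelimiter(",", QuoteStyle.None),
            "Merged " & Text.From(current))),
    // one row per intervention/date pair
    Unpivoted = Table.UnpivotOtherColumns(Merged, {"INDX"}, "Attribute", "Value")
in
    Unpivoted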
