Hi,
We have a model with email messages, send logs, contacts, and transactional data from sales.
Message:

| Message Nom | Message ID |
|---|---|
| webinaire envoie 1 | 1 |
| webinaire envoie 2 | 2 |
| webinaire envoie 3 | 3 |
| webinaire envoie 4 | 4 |
Send Log:

| Message ID | Sent_date | Contact ID |
|---|---|---|
| 1 | January 18, 2024 | 10 |
| 1 | January 18, 2024 | 20 |
| 1 | January 18, 2024 | 30 |
| 1 | January 21, 2024 | 40 |
| 2 | February 9, 2024 | 10 |
| 2 | February 9, 2024 | 40 |
| 2 | February 10, 2024 | 20 |
| 3 | March 1, 2024 | 10 |
| 3 | March 16, 2024 | 30 |
| 4 | April 1, 2024 | 10 |
| 4 | April 1, 2024 | 50 |
| 4 | April 3, 2024 | 20 |
| 4 | April 5, 2024 | 30 |
| 4 | April 10, 2024 | 40 |
Contact:

| Contact ID | Epargnant id |
|---|---|
| 10 | 100 |
| 20 | 200 |
| 30 | 300 |
| 40 | 400 |
| 50 | 500 |
Transactional:

| Epargnant_id | date_report | transaction_brut | ind_adhesion |
|---|---|---|---|
| 100 | January 25, 2024 | 200 | 1 |
| 100 | February 27, 2024 | 30 | |
| 100 | March 29, 2024 | 50 | |
| 100 | April 15, 2024 | 120 | |
| 200 | February 15, 2024 | 20 | 1 |
| 200 | March 4, 2024 | 100 | |
| 200 | March 29, 2024 | 90 | |
| 200 | April 18, 2024 | 10 | |
| 300 | February 4, 2024 | 70 | 1 |
| 300 | February 29, 2024 | 130 | |
| 300 | March 4, 2024 | 20 | |
| 300 | March 19, 2024 | 90 | |
| 300 | April 4, 2024 | 40 | |
| 400 | January 25, 2024 | 20 | 1 |
| 400 | February 27, 2024 | 130 | |
| 400 | April 12, 2024 | 10 | |
| 500 | April 25, 2024 | 100 | 1 |
We are looking to show the sum of transaction_brut from the transactional data, but only for rows whose date_report falls within 14 days after the Sent_date.
So our final matrix should look like:
| Message | Amount |
|---|---|
| Webinaire Envoie 1 | 220 |
| Webinaire Envoie 2 | 20 |
| Webinaire Envoie 3 | 90 |
| Webinaire Envoie 4 | 10 |
Webinaire Envoie 1 would be 220 because client 100 had a $200 sale between January 18th, 2024 and 14 days later (February 1st), and client 400 had a $20 sale between January 21st and February 4th. The sale from client 300 on February 4th should not count, since his sent date was January 18th and the 14-day period ended February 1st. All the sales for client 200 also fall after the 14-day period following his email.
We are fully aware that we might have overlapping reporting (webinaire 1 and 3 might show the same amount from the same client twice).
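The expected matrix can be cross-checked with a short plain-Python sketch of the logic (an illustration only, not the Power BI implementation). One boundary assumption is needed to reproduce the numbers above: the window is sent_date ≤ date_report < sent_date + 14 days, so a report exactly 14 days after the send (the 120 transaction on April 15, for the April 1 send) does not count.

```python
from datetime import date

# Sample data from the post.
send_log = [  # (message_id, sent_date, contact_id)
    (1, date(2024, 1, 18), 10), (1, date(2024, 1, 18), 20),
    (1, date(2024, 1, 18), 30), (1, date(2024, 1, 21), 40),
    (2, date(2024, 2, 9), 10),  (2, date(2024, 2, 9), 40),
    (2, date(2024, 2, 10), 20),
    (3, date(2024, 3, 1), 10),  (3, date(2024, 3, 16), 30),
    (4, date(2024, 4, 1), 10),  (4, date(2024, 4, 1), 50),
    (4, date(2024, 4, 3), 20),  (4, date(2024, 4, 5), 30),
    (4, date(2024, 4, 10), 40),
]
contact = {10: 100, 20: 200, 30: 300, 40: 400, 50: 500}  # Contact ID -> Epargnant id
transactions = [  # (epargnant_id, date_report, transaction_brut)
    (100, date(2024, 1, 25), 200), (100, date(2024, 2, 27), 30),
    (100, date(2024, 3, 29), 50),  (100, date(2024, 4, 15), 120),
    (200, date(2024, 2, 15), 20),  (200, date(2024, 3, 4), 100),
    (200, date(2024, 3, 29), 90),  (200, date(2024, 4, 18), 10),
    (300, date(2024, 2, 4), 70),   (300, date(2024, 2, 29), 130),
    (300, date(2024, 3, 4), 20),   (300, date(2024, 3, 19), 90),
    (300, date(2024, 4, 4), 40),
    (400, date(2024, 1, 25), 20),  (400, date(2024, 2, 27), 130),
    (400, date(2024, 4, 12), 10),
    (500, date(2024, 4, 25), 100),
]

# For every send, add up the contact's transactions reported from the
# send date up to (but not including) 14 days later.
amount = {}
for msg, sent, cid in send_log:
    ep = contact[cid]
    for ep_id, reported, brut in transactions:
        if ep_id == ep and 0 <= (reported - sent).days < 14:
            amount[msg] = amount.get(msg, 0) + brut

print(amount)  # {1: 220, 2: 20, 3: 90, 4: 10}
```

The sketch deliberately does not deduplicate overlapping windows, matching the caveat above that the same transaction can be reported under two messages.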
Hi, @Sibrulotte
You can try the following method to solve the problem.
Steps:
1. Three consecutive merges: Message → Send Log → Contact → Transactional
2. Add a custom column to calculate the date difference between [Transactional.date_report] and [Send Log.Sent_date]
3. Filter on the date difference so that only transactions reported within the 14-day window after the send are kept
4. Remove the unneeded columns
5. Back in Power BI Desktop, use the following measure.
transaction_brut_result =
CALCULATE(SUM(ResultTable[Transactional.transaction_brut]))
Advanced Editor:
let
    // Join Message to Send Log on Message ID
    Source = Table.NestedJoin(Message, {"Message ID"}, #"Send Log", {"Message ID"}, "Send Log", JoinKind.RightOuter),
    #"Expanded Send Log" = Table.ExpandTableColumn(Source, "Send Log", {"Message ID", "Sent_date", "Contact ID"}, {"Send Log.Message ID", "Send Log.Sent_date", "Send Log.Contact ID"}),
    // Join in Contact to map Contact ID to Epargnant id
    #"Merged Queries" = Table.NestedJoin(#"Expanded Send Log", {"Send Log.Contact ID"}, Contact, {"Contact ID"}, "Contact", JoinKind.LeftOuter),
    #"Expanded Contact" = Table.ExpandTableColumn(#"Merged Queries", "Contact", {"Contact ID", "Epargnant id"}, {"Contact.Contact ID", "Contact.Epargnant id"}),
    // Join in Transactional on Epargnant id
    #"Merged Queries1" = Table.NestedJoin(#"Expanded Contact", {"Contact.Epargnant id"}, Transactional, {"Epargnant_id"}, "Transactional", JoinKind.LeftOuter),
    #"Expanded Transactional" = Table.ExpandTableColumn(#"Merged Queries1", "Transactional", {"Epargnant_id", "date_report", "transaction_brut", "ind_adhesion"}, {"Transactional.Epargnant_id", "Transactional.date_report", "Transactional.transaction_brut", "Transactional.ind_adhesion"}),
    // Difference between report date and send date
    #"Added Custom" = Table.AddColumn(#"Expanded Transactional", "Custom", each [Transactional.date_report] - [Send Log.Sent_date]),
    // Keep reports from the send date up to, but not including, 14 days later
    // (a report exactly 14 days after the send must not count, per the expected matrix)
    #"Filtered Rows" = Table.SelectRows(#"Added Custom", each [Custom] >= #duration(0, 0, 0, 0) and [Custom] < #duration(14, 0, 0, 0)),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"Message ID", "Send Log.Message ID", "Send Log.Sent_date", "Send Log.Contact ID", "Contact.Contact ID", "Contact.Epargnant id", "Transactional.Epargnant_id", "Transactional.date_report", "Transactional.ind_adhesion", "Custom"})
in
    #"Removed Columns"
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
Hi @v-yaningy-msft ,
unfortunately the first steps cannot be done properly. There is more than one sent date per transaction.
That will multiply my transaction table like crazy. Not doable.
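The fan-out can be quantified with a small sketch using row counts taken from the sample tables above: the Send Log → Contact → Transactional merge pairs each send with every transaction of that contact's epargnant before the date filter runs, so the row count grows multiplicatively.

```python
# Sends per contact (counted from the sample Send Log) and transactions
# per epargnant (counted from the sample Transactional table).
sends_per_contact = {10: 4, 20: 3, 30: 3, 40: 3, 50: 1}
contact_to_epargnant = {10: 100, 20: 200, 30: 300, 40: 400, 50: 500}
txns_per_epargnant = {100: 4, 200: 4, 300: 5, 400: 3, 500: 1}

# Each send row is paired with every transaction of its epargnant.
merged_rows = sum(
    sends * txns_per_epargnant[contact_to_epargnant[cid]]
    for cid, sends in sends_per_contact.items()
)
original_rows = sum(txns_per_epargnant.values())

print(original_rows, merged_rows)  # 17 29
```

Even on this tiny sample the 17 transaction rows become 53 merged rows before filtering; on a production-sized transaction table the blow-up is correspondingly larger.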
Hi, @Sibrulotte
You can share the pbix file without sensitive data for testing by Google Drive.
Best Regards,
Yang
Community Support Team
It's very unfortunate but my organisation does not allow it.
Here is a sharefile link instead:
https://fondsftq.sharefile.com/d-s776bd94338064f10ad8e262d593f09f6
Hi, @Sibrulotte
The link you shared requires logging in to a personal account in order to open it, which is not allowed under our security and privacy policy. Could you share a link that can be opened without logging in to a personal account? (Google Drive can do this.)
Best Regards,
Yang
Community Support Team
I'd love to share the PBIX file, but not sure of the best way.
Hi my message was marked as spam and is just getting posted, so I'm bringing it back to the top for your attention. Thank you 🙂