Hi Everyone,
Context:
My company has two chains of stores and ships orders from 3 different warehouses to customers (nothing unusual so far). Each warehouse has a perimeter of postcodes. The difficulty is that this perimeter changes depending on the store chain and the date.
What I want to do:
I want to count, for each warehouse, the number of sales orders that have been shipped to postcodes inside its perimeter.
What I have:
4 tables, with the data and data model below.
Result I'm expecting:
Below is the intermediate result (to understand what I'm trying to do):
| sales ID | Shipped from correct warehouse? | should have been shipped from |
|---|---|---|
| S001 | TRUE | |
| S002 | TRUE | |
| S003 | FALSE | W3 |
| S004 | TRUE | |
| S005 | TRUE | |
| S006 | TRUE | |
| S007 | FALSE | W2 |
| S008 | TRUE | |
| S009 | FALSE | W3 |
| S010 | TRUE | |
Final result:

| Warehouse | number of orders shipped in correct perimeter |
|---|---|
| W1 | 5 |
| W2 | 0 |
| W3 | 3 |
I was thinking of using CALCULATE and COUNT with RELATED, but I found out that it isn't possible to use RELATED across a many-to-many relationship...
I came up with something like this:
```dax
sales orders OK =
CALCULATE (
    COUNTROWS ( Sales ),
    RELATED ( Postcode_Warehouse[Postcode] ) = Sales[destination postcode],
    RELATED ( Postcode_Warehouse[Warehouse attached] ) = Sales[Shipped from],
    RELATED ( Postcode_Warehouse[store] ) = Sales[Store],
    RELATED ( Postcode_Warehouse[1st date in use] ) >= Sales[date of shippement],
    RELATED ( Postcode_Warehouse[last date in use] ) < Sales[date of shippement]
)
```
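Edit: a FILTER-based sketch of the same conditions without RELATED, as a calculated column on Sales (untested; I flipped the two date comparisons so the shipment date must fall between the first and last dates in use, and treated a blank last date as still in use):

```dax
-- calculated column on the Sales table (sketch, untested)
Shipped from correct warehouse? =
VAR ShipDate = Sales[date of shippement]
VAR Matches =
    FILTER (
        Postcode_warehouse,
        Postcode_warehouse[Postcode] = Sales[destination postcode]
            && Postcode_warehouse[Warehouse attached] = Sales[Shipped from]
            && Postcode_warehouse[store] = Sales[store]
            && Postcode_warehouse[1st date in use] <= ShipDate
            && ( ISBLANK ( Postcode_warehouse[last date in use] )
                || ShipDate <= Postcode_warehouse[last date in use] )
    )
RETURN
    COUNTROWS ( Matches ) > 0
```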
I thought of another solution, which was to create a table of unique postcode values to try to bypass that many-to-many problem, but it still didn't work out...
I hope I made myself clear...
Anyway, thanks for helping me out on this issue... I've been on it for quite some time and I'm really hoping someone can help me. 🤞
Below is the data for the 2 main tables:
sales

| sales ID | Shipped from | date of shippement | destination postcode | store |
|---|---|---|---|---|
| S001 | W1 | 04/03/2020 | 100 | store 1 |
| S002 | W1 | 05/02/2021 | 100 | store 1 |
| S003 | W1 | 06/06/2020 | 400 | store 2 |
| S004 | W3 | 13/12/2020 | 500 | store 2 |
| S005 | W3 | 06/01/2021 | 400 | store 2 |
| S006 | W1 | 15/10/2020 | 200 | store 1 |
| S007 | W1 | 02/03/2021 | 200 | store 1 |
| S008 | W1 | 24/06/2020 | 200 | store 1 |
| S009 | W2 | 13/01/2021 | 400 | store 2 |
| S010 | W3 | 18/07/2020 | 500 | store 1 |
Postcode_warehouse

| Postcode | Warehouse attached | store | 1st date in use | last date in use |
|---|---|---|---|---|
| 100 | W1 | store 1 | 01/01/2020 | 31/12/2020 |
| 100 | W1 | store 2 | 01/01/2020 | 31/12/2020 |
| 200 | W1 | store 1 | 01/01/2020 | 31/12/2020 |
| 200 | W1 | store 2 | 01/01/2020 | 31/12/2020 |
| 300 | W2 | store 1 | 01/01/2020 | 31/12/2020 |
| 300 | W2 | store 2 | 01/01/2020 | 31/12/2020 |
| 400 | W2 | store 1 | 01/01/2020 | 31/12/2020 |
| 400 | W3 | store 2 | 01/01/2020 | 31/12/2020 |
| 500 | W3 | store 1 | 01/01/2020 | 31/12/2020 |
| 500 | W3 | store 2 | 01/01/2020 | 31/12/2020 |
| 100 | W1 | store 1 | 01/01/2021 | |
| 100 | W1 | store 2 | 01/01/2021 | |
| 200 | W2 | store 1 | 01/01/2021 | |
| 200 | W1 | store 2 | 01/01/2021 | |
| 300 | W2 | store 1 | 01/01/2021 | |
| 300 | W2 | store 2 | 01/01/2021 | |
| 400 | W2 | store 1 | 01/01/2021 | |
| 400 | W3 | store 2 | 01/01/2021 | |
| 500 | W3 | store 1 | 01/01/2021 | |
| 500 | W3 | store 2 | 01/01/2021 | |
@Charles_Tex , First of all, are postcode and store a unique combination in Postcode_warehouse?
If so, create a concatenated key:
`key = [postcode] & " " & [store]`
In Sales, a new column:
`key = [destination postcode] & " " & [store]`
Can this join then be 1-M (many on the Sales side)?
Do not join store_chain and Postcode_warehouse; I doubt that will be needed.
Amitchandak, thanks for your answer.
Unfortunately the combination postcode & store is not unique in Postcode_warehouse: because of the date columns, the same combination can occur over different periods of time...
If I follow your logic I should build a key = postcode & store & date in Postcode_warehouse to get a unique value. Yet this table has a start date and an end date, so a single date isn't what matters; it's the period between those two dates...
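The only approach I can see is to test the period row by row instead of building a key. A sketch of the kind of measure I mean (untested; column names are the ones from my tables above, and a blank last date in use is treated as still in use):

```dax
-- measure (sketch, untested): counts Sales rows whose shipment matches
-- a Postcode_warehouse row on postcode, warehouse, store AND date period
number of orders shipped in correct perimeter =
COUNTROWS (
    FILTER (
        Sales,
        NOT ISEMPTY (
            FILTER (
                Postcode_warehouse,
                Postcode_warehouse[Postcode] = Sales[destination postcode]
                    && Postcode_warehouse[Warehouse attached] = Sales[Shipped from]
                    && Postcode_warehouse[store] = Sales[store]
                    && Postcode_warehouse[1st date in use] <= Sales[date of shippement]
                    && ( ISBLANK ( Postcode_warehouse[last date in use] )
                        || Sales[date of shippement] <= Postcode_warehouse[last date in use] )
            )
        )
    )
)
```

Put against Sales[Shipped from] in a table visual, this should in principle give the per-warehouse breakdown I'm after.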