I'm building a Power BI report to track stock and inventory levels for various items across different locations. My data source is the item ledger entry table from Navision stored in SQL Server. This table captures item movements for each date and location.
I've imported the data using the following SQL query:
WITH day_quantity AS (
    SELECT
        il.[Posting Date]  AS posting_date,
        il.[Item No_]      AS item_no,
        il.[Location Code] AS location_code,
        SUM(il.[Quantity]) AS quantity
    FROM [COG$Item Ledger Entry] AS il
    GROUP BY il.[Posting Date], il.[Item No_], il.[Location Code]
)
SELECT
    posting_date,
    item_no,
    location_code,
    SUM(quantity) OVER (
        PARTITION BY item_no, location_code
        ORDER BY posting_date
        ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
    ) AS stock
FROM day_quantity;
This query calculates the stock quantity for each item and location on days with movements. However, it doesn't provide a complete picture of the actual stock for days without transactions.
My questions are:
- Is importing a view from SQL Server with a structure like "date, item, location, quantity" (one row per unique combination) a suitable approach? I don't need a record for items with zero quantity, but I'm concerned about the potential data volume. This would be the best option for me, because I can see how to calculate the variations, but I haven't managed to write the right SQL query.
- Can I use DAX measures to calculate the daily stock for all items and locations, even on days without transactions? I'm relatively new to DAX and would appreciate guidance on creating such a measure.
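One way to get a row for every date, item, and location (not only days with movements) is to build a date-by-item-by-location "spine" and left-join the daily movements onto it before taking the running total. The sketch below assumes SQL Server and a calendar table `dbo.Calendar(calendar_date)` covering the needed date range; that table name, and everything except `[COG$Item Ledger Entry]` and its columns, is an assumption, not something from your database:

```sql
-- Sketch only: dbo.Calendar(calendar_date) is an assumed date-spine table.
WITH day_quantity AS (
    SELECT
        il.[Posting Date]  AS posting_date,
        il.[Item No_]      AS item_no,
        il.[Location Code] AS location_code,
        SUM(il.[Quantity]) AS quantity
    FROM [COG$Item Ledger Entry] AS il
    GROUP BY il.[Posting Date], il.[Item No_], il.[Location Code]
),
spine AS (
    -- every date x (item, location) combination that ever had a movement
    SELECT c.calendar_date, k.item_no, k.location_code
    FROM dbo.Calendar AS c
    CROSS JOIN (SELECT DISTINCT item_no, location_code FROM day_quantity) AS k
)
SELECT
    s.calendar_date AS posting_date,
    s.item_no,
    s.location_code,
    SUM(COALESCE(d.quantity, 0)) OVER (
        PARTITION BY s.item_no, s.location_code
        ORDER BY s.calendar_date
        ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
    ) AS stock
FROM spine AS s
LEFT JOIN day_quantity AS d
       ON d.posting_date  = s.calendar_date
      AND d.item_no       = s.item_no
      AND d.location_code = s.location_code;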
Hi @v-kongfanf-msft ,
Here is an example of the item ledger entries (from the SQL database):
| Item_no | date | transfer_type | location_code | quantity |
|---|---|---|---|---|
| FMB-23-00 | 08/05/2024 | sales | ENA | -150 |
| FMB-23-00 | 08/05/2024 | purchases | ENA | 200 |
| LK-23-00 | 08/05/2024 | sales | DOM | -60 |
| FMB-23-00 | 07/05/2024 | sales | BUD | -98 |
| LOB-23-00 | 07/05/2024 | transfer | DOM | 75 |
| LOB-23-00 | 07/05/2024 | transfer | BUD | -75 |
| FMB-23-00 | 07/05/2024 | positive adjustment | BUD | 40 |
| FMB-23-00 | 07/05/2024 | sales | DOM | -600 |
| FMB-23-00 | 06/05/2024 | purchase | DOM | 796 |
| FMB-23-00 | 06/05/2024 | purchase | DOM | 755 |
| LK-23-00 | 06/05/2024 | purchase | DOM | 300 |
| LOB-23-00 | 06/05/2024 | purchase | BUD | 75 |
From my SQL query, I get this table with the correct stock for each item, but only when there was a movement:
| Item_no | date | location_code | quantity |
|---|---|---|---|
| FMB-23-00 | 08/05/2024 | ENA | -28 |
| LK-23-00 | 08/05/2024 | DOM | 240 |
| FMB-23-00 | 07/05/2024 | BUD | -58 |
| LOB-23-00 | 07/05/2024 | DOM | 75 |
| LOB-23-00 | 07/05/2024 | BUD | 0 |
| FMB-23-00 | 07/05/2024 | DOM | 951 |
| FMB-23-00 | 06/05/2024 | DOM | 1551 |
| LK-23-00 | 06/05/2024 | DOM | 300 |
| LOB-23-00 | 06/05/2024 | BUD | 75 |
Ultimately, I would like this result (I do not need the rows where the quantity is 0):
| Item_no | date | location_code | quantity |
|---|---|---|---|
| FMB-23-00 | 08/05/2024 | ENA | -28 |
| FMB-23-00 | 08/05/2024 | BUD | -58 |
| FMB-23-00 | 08/05/2024 | DOM | 951 |
| LK-23-00 | 08/05/2024 | DOM | 240 |
| LOB-23-00 | 08/05/2024 | DOM | 75 |
| LOB-23-00 | 08/05/2024 | BUD | 0 |
| FMB-23-00 | 07/05/2024 | ENA | 0 |
| FMB-23-00 | 07/05/2024 | BUD | 0 |
| FMB-23-00 | 07/05/2024 | DOM | 1551 |
| LK-23-00 | 07/05/2024 | DOM | 300 |
| LOB-23-00 | 07/05/2024 | DOM | 0 |
| LOB-23-00 | 07/05/2024 | BUD | 75 |
| FMB-23-00 | 06/05/2024 | ENA | 0 |
| FMB-23-00 | 06/05/2024 | BUD | 0 |
| FMB-23-00 | 06/05/2024 | DOM | 1551 |
| LK-23-00 | 06/05/2024 | DOM | 300 |
| LOB-23-00 | 06/05/2024 | DOM | 0 |
| LOB-23-00 | 06/05/2024 | BUD | 75 |
Is it clearer like this?
Thanks
Marie
Hi @marie_joy ,
Could you please provide a screenshot of the desired result and describe it? It would also be great if you could share relevant test data for the data model. That would make it easier for me to answer your question as soon as possible.
Best Regards,
Adamk Kong