ReyCarter
Helper I

DAX: cyclic dependency detected

Greetings!

Friends, tell me how to get past the error "Cyclic dependency detected". I am creating two calculated columns that compute the same value, but in different units: the first column computes sales in rubles, and the second computes sales in weight units.

How can I avoid this error?

DAX below:

-- first DAX
Sales in rubles =
VAR CompanyID = 'Sales '[Company]
VAR ContractID = 'Sales '[Document]
VAR ContractDate = 'Sales '[Date_document]
RETURN
    CALCULATE(
        SUM('Sales '[Sales_rub]),
        FILTER(
            ALL('Full'[Company], 'Full'[Document], 'Full'[Date_document], 'Full'[Date_approval]),
            'Full'[Company] = CompanyID
                && 'Full'[Document] = ContractID
                && 'Full'[Date_document] = ContractDate
                && 'Full'[Date_document] >= 'Full'[Date_approval]
        )
    )

-- second DAX
Sales in kilograms =
VAR CompanyID = 'Sales '[Company]
VAR ContractID = 'Sales '[Document]
VAR ContractDate = 'Sales '[Date_document]
RETURN
    CALCULATE(
        SUM('Sales '[Sales_kg]),
        FILTER(
            ALL('Full'[Company], 'Full'[Document], 'Full'[Date_document], 'Full'[Date_approval]),
            'Full'[Company] = CompanyID
                && 'Full'[Document] = ContractID
                && 'Full'[Date_document] = ContractDate
                && 'Full'[Date_document] >= 'Full'[Date_approval]
        )
    )

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Following up on my last post... Here's how to make PBI work with incremental refresh when using files: Chris Webb's BI Blog: Keep The Existing Data In Your Power BI Dataset And Add New Data To It Using I...

 

And here's the article that Chris refers to at the beginning of his article: Incremental refresh for files in a Folder or SharePoint - Power BI — Powered Solutions 


13 REPLIES
daxer-almighty
Solution Sage

@ReyCarter

 

No. It does NOT require a date field. Please check out the links I posted.


As far as I understand, incremental refresh is available on Premium plans and in the cloud service, and we have Desktop.
We plan to switch to Report Server.

Anonymous
Not applicable

@ReyCarter 

 

It's a very, very, very bad idea to put CALCULATE in a calculated column of a fact table. You'll feel the heat very soon. Fact tables are usually very big in terms of the number of rows. Forcing the engine to perform millions (or many more) of context transitions is nothing more than asking for trouble with performance. And, of course, the use of CALCULATE is also the source of your troubles with cyclic references.
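
To make that concrete, here's a minimal sketch of a rewrite without CALCULATE, assuming each Company/Document/Date_document combination matches a single row in 'Full' (the LOOKUPVALUE relies on that). With no context transition, the column depends only on the columns it names, which is what breaks the cycle:

Sales in rubles =
VAR CompanyID = 'Sales '[Company]
VAR ContractID = 'Sales '[Document]
VAR ContractDate = 'Sales '[Date_document]
-- assumes one 'Full' row per key; adapt if the key repeats
VAR ApprovalDate =
    LOOKUPVALUE(
        'Full'[Date_approval],
        'Full'[Company], CompanyID,
        'Full'[Document], ContractID,
        'Full'[Date_document], ContractDate
    )
RETURN
    IF(
        ContractDate >= ApprovalDate,
        SUMX(
            FILTER(
                'Sales ',
                'Sales '[Company] = CompanyID
                    && 'Sales '[Document] = ContractID
                    && 'Sales '[Date_document] = ContractDate
            ),
            'Sales '[Sales_rub]
        )
    )

The kilogram column is the same sketch with 'Sales '[Sales_kg] in the SUMX.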

I really like these answers, but what's the solution?
Knowing what is good or bad practice is useful, but it doesn't solve the problem.

Anonymous
Not applicable

@ReyCarter 

 

Please let me ask you this question: Why don't you use the power of Power Query to calculate these columns? In fact, this is exactly the place where you should calculate such things, especially on fact tables. Not only will the calculation be faster (which is important when refreshing the model), but the column will also be compressed in an optimal way, which means your DAX will be more performant.
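
As an illustration, here's a minimal Power Query sketch, assuming hypothetical staging queries named Sales and Full with the column names from the DAX above; it pulls Date_approval into Sales so the approval test becomes a plain per-row column computed at refresh time:

let
    // bring Date_approval over on the three-column key
    Merged = Table.NestedJoin(
        Sales, {"Company", "Document", "Date_document"},
        Full, {"Company", "Document", "Date_document"},
        "FullRows", JoinKind.LeftOuter
    ),
    Expanded = Table.ExpandTableColumn(Merged, "FullRows", {"Date_approval"}),
    // this flag replaces the FILTER condition from the DAX
    Approved = Table.AddColumn(
        Expanded, "IsApproved",
        each [Date_document] >= [Date_approval], type logical
    )
in
    Approved

With the flag materialized, the ruble and kilogram totals can be plain aggregations filtered on IsApproved instead of CALCULATE columns.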

We have fact tables split by year into separate tables (2020, 2021). Only the 2021 table is updated with new data.
In Power BI, the fact table is built with the formula UNION(2020, 2021).
It's not clear how to use Power Query here, since the combined table doesn't exist in it; there are only the two per-year tables (2020, 2021). How can I calculate, for example, a column for a fact table that is not physically present in PQ?

Anonymous
Not applicable

@ReyCarter 

 

I can see you are doing things the wrong way all along. If your tables are in Import mode (and I assume they are), then all of them are imported through Power Query. You should do all your ETL in Power Query, not in DAX. UNION on 2 fact tables in DAX? Very bad idea. Please do all your data preparation in Power Query. This is the right and correct way. Power Query is a data mashup engine for a reason... DAX is for Data Analysis (Data Analysis eXpressions). Please note that if you don't set up your PBI correctly from the start, you'll have huge problems down the line. Trust me: you don't want to do things incorrectly.
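
For example, a minimal sketch, assuming two hypothetical staging queries Sales2020 and Sales2021 with identical column layouts; appending them in Power Query replaces the UNION in DAX:

let
    // append the per-year tables into a single fact table
    Combined = Table.Combine({Sales2020, Sales2021})
in
    Combined

Disable load on the two staging queries and load only the combined table, so the model sees one physical fact table.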

OK, how would you load data from 500 Excel files of the same format into Power Query? The report would then refresh for 3 hours whenever a new file appears or a single file in the folder is updated.
When a new Excel file shows up in the folder, the split into several tables by year ensures that the old data is not refreshed, only the current (2021) data.
When analyzing the data, we work with a combined table covering all periods. If you have a better solution, tell me how to do this task correctly.

Anonymous
Not applicable

@ReyCarter 

 

I've created 500 Excel files in a folder on my PC, each containing a table with 20 columns (a mix of integers, floats, and text) and 365 rows. I used the Folder connector in Power Query to load this data, automatically consolidate it into one table, and then load it into PBI Desktop. A full refresh of all the data takes no more than about 10 seconds.

 

So, the question is: How many rows do you have in each of the files that make you worried about the refresh time? I'm not sure how to do an incremental refresh when loading data from Excel files, or whether it's possible at all. I'll have to find out. But if you don't have monstrous Excel files with tens of thousands of rows, then I can't see why you couldn't do what I've done...
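
For reference, a minimal sketch of that Folder-connector setup, assuming a hypothetical folder path and one sheet named "Data" in each workbook:

let
    Source = Folder.Files("C:\SalesFiles"),
    // keep only the Excel workbooks
    ExcelOnly = Table.SelectRows(Source, each Text.EndsWith([Name], ".xlsx")),
    // parse each workbook and pick out its sheet
    Parsed = Table.AddColumn(
        ExcelOnly, "Data",
        each Table.PromoteHeaders(
            Excel.Workbook([Content]){[Item = "Data", Kind = "Sheet"]}[Data]
        )
    ),
    // consolidate all files into a single table
    Combined = Table.Combine(Parsed[Data])
in
    Combined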

The fact is that we have 500 files, each with 10–15 thousand rows. It takes us 3 hours. I'm not making this up; that's how it is in practice.
Ten thousand rows is not a monstrous file; monstrous is 800 thousand and millions.

And sometimes the data is even spread across several tabs, as people like to do. I'll look at your link about incremental refresh, but as far as I understand it requires a field with the date a row was added, and we don't have one.
More precisely, there is one, but it doesn't match the actual date.

Therefore, it was decided to do it the only possible way.

amitchandak
Super User

@ReyCarter , try new columns like these:

 

Sales in rubles =
CALCULATE(
    SUM('Sales '[Sales_rub]),
    FILTER(
        'Full',
        'Full'[Company] = EARLIER('Full'[Company])
            && 'Full'[Document] = EARLIER('Sales '[Document])
            && 'Full'[Date_document] = EARLIER('Sales '[Date_document])
            && 'Full'[Date_document] >= EARLIER('Full'[Date_approval])
    )
)

Sales in kilograms =
CALCULATE(
    SUM('Sales '[Sales_kg]),
    FILTER(
        'Full',
        'Full'[Company] = EARLIER('Full'[Company])
            && 'Full'[Document] = EARLIER('Sales '[Document])
            && 'Full'[Date_document] = EARLIER('Sales '[Date_document])
            && 'Full'[Date_document] >= EARLIER('Full'[Date_approval])
    )
)

Does not work.
"It is not possible to determine a single value for the Company column in the Full table. This can happen when a measure formula refers to a column that contains many values, without specifying an aggregation such as MIN, MAX..., to get a single result."
