I just published the report to my test environment, but it crashes because it consumes too much memory. My test databases have a limited number of table rows (max 1M).
Is it possible to optimize the report so it can handle a million rows of data? Error message attached.
Raising the tier is not an option, as that would make the whole proposition far too expensive.
Data is currently being imported; would switching to DirectQuery help?
I would suggest using DirectQuery instead of Import mode. Since your source tables are limited to about 1M rows, DirectQuery should perform acceptably, and because it queries the source at report time rather than loading the data into memory, it should avoid the memory errors you are seeing.
Also, here is the Optimization guide for Power BI; please refer to it: