Hi,
I am trying to understand how the dataset caches the data pushed through REST calls. One of the videos (https://www.youtube.com/watch?v=PpiUsSCXFhM) says the data won't be imported, but will be cached.
Isn't it possible to generate a report that uses data across a date range, say a year or more, with more than 200,000 rows? One alternative is to specify no retention policy: a dataset created with the 'none' retention policy stores up to 5,000,000 rows per table. Is it then possible to back up the data beyond that limit?
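For context, the retention policy is chosen when the push dataset is created, via the `defaultRetentionPolicy` query parameter on the REST API's create-dataset call. Below is a minimal sketch of building that request; the dataset name, table, and columns are hypothetical, and a real call would also need an Azure AD access token in the `Authorization` header:

```python
import json

def build_create_dataset_request(retention_policy="basicFIFO"):
    # retention_policy is "None" (keep up to 5,000,000 rows per table)
    # or "basicFIFO" (keep the most recent 200,000 rows).
    url = ("https://api.powerbi.com/v1.0/myorg/datasets"
           f"?defaultRetentionPolicy={retention_policy}")
    body = {
        "name": "SalesPush",          # hypothetical dataset name
        "defaultMode": "Push",
        "tables": [{
            "name": "Sales",          # hypothetical table
            "columns": [
                {"name": "Date", "dataType": "DateTime"},
                {"name": "Amount", "dataType": "Double"},
            ],
        }],
    }
    return url, json.dumps(body)

# Request a dataset with no retention policy (the 5,000,000-row variant).
url, payload = build_create_dataset_request("None")
```

The resulting `url` and `payload` would be sent as a POST with `Content-Type: application/json`.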
As per the basicFIFO policy (https://msdn.microsoft.com/en-us/library/mt186545.aspx), once the table reaches 200,000 rows, the next 10,000 rows are inserted only after the first 10,000 rows are removed. Is it possible to back up the deleted rows?
What if I want to generate a report that includes the data from these 210,000 or more rows?