Hi All,
I have data stored in my Azure container as Parquet files, nested several subfolders deep. I want to interact with this data and update it. I am trying to move the Parquet data to Dataverse using Dataflows with the ADLS Gen2 connector, but Power Query does not give me a way to navigate to a file that is buried inside many subfolders. I have also tried placing the file at the container root, outside all subfolders, but on connecting to the container Power Query shows only the files inside the top-level folders, not the ones sitting outside any folder.
Can someone please guide me on how to work around this? Also, if there is a better way to do the same thing, I am open to ideas.
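One workaround that may help: instead of drilling through folders in the navigator, connect to the container and filter the flat file listing in M by its `Folder Path` and `Name` columns. A minimal sketch, where the account URL, the `/sub1/sub2/` path, and `data.parquet` are hypothetical placeholders to replace with your own:

```
let
    // AzureStorage.DataLake returns a flat table of the files it can see,
    // with Content, Name and Folder Path columns
    Source = AzureStorage.DataLake("https://<account>.dfs.core.windows.net/<container>"),
    // filter the listing down to the nested file instead of navigating to it
    Filtered = Table.SelectRows(
        Source,
        each Text.EndsWith([Folder Path], "/sub1/sub2/") and [Name] = "data.parquet"
    ),
    // read the Parquet content of the single remaining row
    Data = Parquet.Document(Filtered{0}[Content])
in
    Data
```

Because the listing is flat rather than hierarchical by default, a file several subfolders deep should still appear in it even though the navigator UI does not drill down to it.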
Ideally yes, but alternatively I am moving the data to Dataverse so that I can perform CRUD operations there. If there is a way to update the data in the Parquet files directly, I would prefer to do that.
Please explain what you mean by "interact with it" - are you planning to write to the Parquet files?