We have a data lake container with three folders a, b, c. Each folder has 3 files: a1, a2, a3, b1, b2, b3, c1, c2, c3. We need to design a pipeline that dynamically does an incremental load from these folders to blob storage, writing files with the same names as the source. I have implemented the incremental load in a data flow. We also have other data flow dependencies, so we cannot use a Copy activity; it has to be a data flow. I am unable to integrate the Get Metadata activity with the data flow, which is where I am expecting some help.
I tried with parameters and variables, but I did not get the desired output. I used Get Metadata with the child items field, then a ForEach loop. Inside that ForEach I tried another ForEach to get the files, and I used an Append Variable activity to collect the file names. I have already implemented the upsert logic for a single table in the data flow. If I pass the second Get Metadata activity's output (inside the ForEach) to the data flow, it does not accept it. The main problem I am facing is integrating the data flow with the ForEach at the dataset level, because the data flow's dataset has to depend on the Get Metadata output.
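To make the structure clearer, here is a minimal single-level sketch of the pipeline JSON I am aiming for (folder iteration only, to show the parameter passing that is giving me trouble). All names here are placeholders I made up, not the real objects in our factory: `GetFolders`, `SourceContainer`, `ForEachFolder`, `IncrementalLoadDataflow`, the `source1`/`sink1` stream names, and the `folderName` dataset parameter. The `datasetParameters` nesting reflects my understanding of the Execute Data Flow activity schema and may not be exact.

```json
{
  "name": "IncrementalLoadPipeline",
  "properties": {
    "activities": [
      {
        "name": "GetFolders",
        "type": "GetMetadata",
        "typeProperties": {
          "dataset": { "referenceName": "SourceContainer", "type": "DatasetReference" },
          "fieldList": [ "childItems" ]
        }
      },
      {
        "name": "ForEachFolder",
        "type": "ForEach",
        "dependsOn": [
          { "activity": "GetFolders", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "items": {
            "value": "@activity('GetFolders').output.childItems",
            "type": "Expression"
          },
          "activities": [
            {
              "name": "RunIncrementalLoad",
              "type": "ExecuteDataFlow",
              "typeProperties": {
                "dataflow": {
                  "referenceName": "IncrementalLoadDataflow",
                  "type": "DataFlowReference",
                  "datasetParameters": {
                    "source1": { "folderName": "@item().name" },
                    "sink1": { "folderName": "@item().name" }
                  }
                }
              }
            }
          ]
        }
      }
    ]
  }
}
```

The idea is that the datasets used by the data flow's source and sink would each declare a `folderName` (and, for file-level iteration, a `fileName`) parameter and use them in their file path. What I cannot work out is how to feed the output of the second, inner Get Metadata activity into those dataset parameters so the data flow picks up the right file on each iteration.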









