
I used this article to successfully copy data from one table to another using Data Flows in Data Factory. Now my scenario is to handle multiple tables in the database; the example above covers only one table.

I tried to follow the next article (link) in the same series and have created the view and ForEach loop, but I am now wondering how to pass the input to the Data Flow activity.

Any ideas, or has anyone tried the same thing?

Thanks

1 Answer


You will need to use a parameterized dataset with a dataset parameter for the table name. Then, from the ForEach activity, pass a string parameter containing the table name into the dataset parameter on that Data Flow activity. This is all wired up at the pipeline level.
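
As a rough sketch, a parameterized dataset could look like the JSON below. The dataset name, linked service, and the Azure SQL dataset type are all assumptions here; substitute whatever source type you actually use:

    {
        "name": "GenericTableDataset",
        "properties": {
            "description": "Hypothetical example - all names are placeholders",
            "type": "AzureSqlTable",
            "linkedServiceName": {
                "referenceName": "AzureSqlLinkedService",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "TableName": { "type": "string" }
            },
            "typeProperties": {
                "tableName": {
                    "value": "@dataset().TableName",
                    "type": "Expression"
                }
            }
        }
    }

The key piece is the @dataset().TableName expression, which resolves to whatever value the pipeline passes in at run time, so one dataset definition can serve every table.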


8 Comments

Mind giving some more details? I still didn't get how you would pass a string parameter from the ForEach activity.
It's described in step 8 of the article you linked in the 2nd paragraph. Use the iterator from the ForEach, which is a string table name, and pass it to the table name in the dataset parameter; see the sketch after these comments. In your case, just use a Data Flow activity instead of the Copy activity from the example.
I'm afraid I still can't figure out how to do this. How will I map each of my source tables to its sink table?
With Data Flows, you would use schemaless datasets for your Source and Sink transformations and either use "auto map" on the Sink or use rule-based mapping to map columns based on patterns.
This video helps explain how to do it: youtu.be/Sj15Yjwai1A
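
To make the pipeline-side wiring concrete, here is a minimal sketch assuming a Lookup activity named LookupTableList that returns the list of table names. All names are hypothetical, and the exact datasetParameters layout can vary by ADF version; in the UI, this is set on the Data Flow activity's Settings tab:

    {
        "name": "ForEachTable",
        "type": "ForEach",
        "typeProperties": {
            "items": {
                "value": "@activity('LookupTableList').output.value",
                "type": "Expression"
            },
            "activities": [
                {
                    "name": "CopyOneTable",
                    "type": "ExecuteDataFlow",
                    "description": "Hypothetical sketch - dataset parameter receives the current table name",
                    "typeProperties": {
                        "dataflow": {
                            "referenceName": "CopyTableDataFlow",
                            "type": "DataFlowReference",
                            "datasetParameters": {
                                "source1": {
                                    "TableName": "@item().TableName"
                                }
                            }
                        }
                    }
                }
            ]
        }
    }

Inside the ForEach, @item() is the current element of the lookup output, so @item().TableName feeds each iteration's table name into the dataset parameter described in the answer. With schema drift enabled on the schemaless datasets and auto-mapping on the Sink, the same data flow can then handle every table.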
