
ADF - How should I copy table data from source Azure SQL Database to 6 other Azure SQL Databases?

Ask Time: 2020-11-24T07:07:52    Author: AdventurGurl


We curate data in the "Dev" Azure SQL Database and currently use RedGate's Data Compare tool to push it up to 6 higher-environment Azure SQL Databases. I am trying to migrate that manual process to ADFv2 and would like to avoid copy/pasting the 10+ copy data activities for each database (x6), to keep it maintainable for future changes. The static tables have some customization in the copy data activity, but the basic idea follows this post to perform an upsert.

How can the implementation described above be done in Azure Data Factory?

I was imagining something like the following:

  1. Using one parameterized linked service that has the server name & database name configurable, to generate a dynamic connection to Azure SQL Database (see the sketch after this list).
  2. Creating a pipeline for each table's copy data activity.
  3. Creating a master pipeline to nest each table's pipeline in.
  4. Using variables to loop over the different connections and passing those to the sub-pipelines' parameters.
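For step 1, a minimal sketch of what a parameterized linked service definition could look like, assuming an Azure SQL Database linked service and omitting the authentication details (e.g. a Key Vault reference or managed identity); the names AzureSqlDynamic, serverName, and databaseName are illustrative, not from the original question:

    {
        "name": "AzureSqlDynamic",
        "properties": {
            "type": "AzureSqlDatabase",
            "parameters": {
                "serverName": { "type": "String" },
                "databaseName": { "type": "String" }
            },
            "typeProperties": {
                "connectionString": "Server=tcp:@{linkedService().serverName}.database.windows.net,1433;Database=@{linkedService().databaseName};"
            }
        }
    }

Datasets built on this linked service would expose the same two parameters, so a master pipeline can pass a different server/database pair on each iteration.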

I'm not sure whether that is the most efficient plan, or even whether it works yet. Other ideas/suggestions?

Author: AdventurGurl, reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/64977931/adf-how-should-i-copy-table-data-from-source-azure-sql-database-to-6-other-azu
Leon Yue:

We cannot tell you if that's the most efficient plan, but I think so. Just make it work.

As you said in the comment:

We can use dynamic pipelines - copy multiple tables in bulk with 'Lookup' & 'ForEach'. We can perform dynamic copies of your data table lists in bulk within a single pipeline. Lookup returns either the list of data or the first row of data. ForEach - @activity('Azure SQL Table lists').output.value ; @concat(item().TABLE_SCHEMA,'.',item().TABLE_NAME,'.csv'). This is efficient and cost optimized since we are using fewer activities and datasets.

Usually we would choose the same solution as you: dynamic parameters/pipelines, with Lookup + ForEach activities to achieve the scenario. In one word: keep the pipeline's logic strong, simple, and efficient.
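A minimal sketch of the Lookup + ForEach pattern described above, assuming the table list comes from INFORMATION_SCHEMA.TABLES and each table is copied by a parameterized child pipeline; the names SourceDbDataset, CopyTablePipeline, and the parameter names are illustrative, not from the original answer:

    {
        "name": "MasterCopyPipeline",
        "properties": {
            "activities": [
                {
                    "name": "Azure SQL Table lists",
                    "type": "Lookup",
                    "typeProperties": {
                        "source": {
                            "type": "AzureSqlSource",
                            "sqlReaderQuery": "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'"
                        },
                        "dataset": { "referenceName": "SourceDbDataset", "type": "DatasetReference" },
                        "firstRowOnly": false
                    }
                },
                {
                    "name": "ForEachTable",
                    "type": "ForEach",
                    "dependsOn": [ { "activity": "Azure SQL Table lists", "dependencyConditions": [ "Succeeded" ] } ],
                    "typeProperties": {
                        "items": { "value": "@activity('Azure SQL Table lists').output.value", "type": "Expression" },
                        "activities": [
                            {
                                "name": "CopyOneTable",
                                "type": "ExecutePipeline",
                                "typeProperties": {
                                    "pipeline": { "referenceName": "CopyTablePipeline", "type": "PipelineReference" },
                                    "parameters": {
                                        "schemaName": { "value": "@item().TABLE_SCHEMA", "type": "Expression" },
                                        "tableName": { "value": "@item().TABLE_NAME", "type": "Expression" }
                                    }
                                }
                            }
                        ]
                    }
                }
            ]
        }
    }

An outer ForEach over an array of the 6 target server/database pairs (or a second Lookup against a control table) could then feed the parameterized linked service from step 1 of the question.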
2020-11-27T05:23:19