I'm working on an Azure Data Factory pipeline and currently have the following challenge:
I'd like to make an API call (POST) whose body must be a JSON array containing multiple objects.
Both parts work when used independently: retrieving the data (from a SQL DB) and the API call (for the API call I've been using hardcoded mock data as the body). The challenge is connecting the two. That means:
I'd like to get multiple rows out of a SQL table, convert them to the required JSON structure, and then pass that data into the API call.
See picture below:
In simple steps explained again:
- Get rows from a SQL table
- Convert each row into an object, e.g. { "somekey": valueOfRow }
- Collect all objects in an array
- Provide the array to the API call
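One way to wire these steps together inside the pipeline itself is Lookup → ForEach/Append Variable → Web activity. A sketch of that pattern (the activity name LookupIsins and the variable idArray are placeholders; it assumes the Lookup's query returns a column named isin):

```
1. Lookup activity "LookupIsins"
   - runs the SQL query, with "First row only" unchecked

2. ForEach activity over:
     @activity('LookupIsins').output.value
   - set to Sequential if the array order matters
   - inside it, an Append Variable activity on "idArray" (type Array) with value:
     @json(concat('{"idType":"ID_ISIN","idValue":"', item().isin, '"}'))

3. Web activity (POST), body:
     @variables('idArray')
```

The json()/concat() expression builds one object per row, Append Variable collects them into the array, and the Web activity then sends that array as the request body.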
I'm just unsure how to proceed with the connection.
Additional Information
As requested, some further detailed information.
Currently the API call uses the following hardcoded mock data:
[{"idType": "ID_ISIN", "idValue": "US0123456789" }]
From the dataflow I'll get rows with one column called isin (with row values such as US0123456789)
The goal is to fill the API's body dynamically so that it receives something like this:
[
{
"idType": "ID_ISIN",
"idValue": "US0123456789"
},
{
"idType": "ID_ISIN",
"idValue": "US9876543210"
},
{...}
]
I saw that one can achieve something similar with a SQL query - see: https://learn.microsoft.com/en-us/azure/azure-sql/database/json-features?view=azuresql
But that way I'd be missing the "idType": "ID_ISIN" part in each row/object.
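For what it's worth, the FOR JSON approach from that link can emit the constant part too: selecting a string literal with an alias adds a fixed "idType" to every object. A sketch (the table name is an assumption; only the column isin is from the question):

```sql
-- dbo.Instruments is a hypothetical table name
SELECT 'ID_ISIN' AS idType,   -- constant literal, repeated in every object
       isin      AS idValue
FROM dbo.Instruments
FOR JSON PATH;                -- emits one JSON array of objects
```

A Lookup activity would then return the whole array as a single string column, which could be parsed with json() before being passed to the API call's body.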

idType and idValue are the names of columns in your SQL table? Or is ID_ISIN the column name and US0123456789 its value?