
I encountered a SQL Server data-migration problem and I hope you can guide me in the right direction.

Assume we have the table DataTable (names simplified) with the following columns:

DataID | SomeForeignKey | SpecificDataValues | OtherSpecificDataValues
int    | int            | String             | String
-------+----------------+--------------------+------------------------
0      | 1              | ['1','2']          | ['1', '2']

where SpecificDataValues and OtherSpecificDataValues are JSON arrays (like ['1', '2']).
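For reference, a simplified reproduction of the current table (the column types here are my assumption, and the sample row is stored exactly as shown above, with single quotes):

CREATE TABLE DataTable (
    DataID                  INT,
    SomeForeignKey          INT,
    SpecificDataValues      NVARCHAR(MAX),  -- holds strings like ['1','2']
    OtherSpecificDataValues NVARCHAR(MAX)
);

INSERT INTO DataTable (DataID, SomeForeignKey, SpecificDataValues, OtherSpecificDataValues)
VALUES (0, 1, '[''1'',''2'']', '[''1'',''2'']');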

Now I want to migrate this table (ideally with a SQL migration script) to a new table:

DataValuesTable
DataID | SomeForeignKey | SpecificDataValues | OtherSpecificDataValues
-------+----------------+--------------------+------------------------
0      | 1              | 1                  | 1
1      | 1              | 2                  | 2

So I basically want to generate a row in the new table for each value stored in "SpecificDataValues" and "OtherSpecificDataValues".

I have already found that there are SQL functions for working with JSON (OPENJSON, JSON_QUERY), but I was not able to produce the desired result using these tools.

I hope you can point me in the right direction.

1 Answer

One solution is to use OPENJSON twice. It will generate Array(2) x Array(2) = 4 rows; the ones you're interested in are those where the index positions ([key]) match:

SELECT DataID, SomeForeignKey, j1.value, j2.value
FROM t
CROSS APPLY OPENJSON(SpecificDataValues) AS j1
CROSS APPLY OPENJSON(OtherSpecificDataValues) AS j2
WHERE j1.[key] = j2.[key]

Another solution is to use OPENJSON and JSON_VALUE together:

SELECT DataID,
       SomeForeignKey,
       j1.value,
       JSON_VALUE(OtherSpecificDataValues, CONCAT('$[', CAST(j1.[key] AS INT), ']'))
FROM t
CROSS APPLY OPENJSON(SpecificDataValues) AS j1
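If you want an actual migration script, either query can feed an INSERT ... SELECT into the new table. A rough sketch based on the first variant (the DataValuesTable definition is my assumption, with the new DataID generated by an IDENTITY column, and t standing for your original table):

-- Sketch only: table and column definitions are assumptions
CREATE TABLE DataValuesTable (
    DataID                  INT IDENTITY(0, 1) PRIMARY KEY,
    SomeForeignKey          INT,
    SpecificDataValues      NVARCHAR(100),
    OtherSpecificDataValues NVARCHAR(100)
);

-- Expand both JSON arrays and pair up values at the same array position
INSERT INTO DataValuesTable (SomeForeignKey, SpecificDataValues, OtherSpecificDataValues)
SELECT t.SomeForeignKey, j1.value, j2.value
FROM t
CROSS APPLY OPENJSON(t.SpecificDataValues) AS j1
CROSS APPLY OPENJSON(t.OtherSpecificDataValues) AS j2
WHERE j1.[key] = j2.[key];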

Note that your "JSON" is invalid: strings must be enclosed in double quotes (").
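If the stored values really do use single quotes, OPENJSON will reject them, so you would have to normalise the strings first. A crude sketch, assuming the values themselves never contain any quote characters:

-- Crude cleanup: turn ['1','2'] into ["1","2"] (assumes values contain no quotes)
UPDATE t
SET SpecificDataValues      = REPLACE(SpecificDataValues, '''', '"'),
    OtherSpecificDataValues = REPLACE(OtherSpecificDataValues, '''', '"');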

Demo on db<>fiddle
