
I'm hoping to build an optimized JSON structure that includes only data, no names. I'll include the names in another JSON.

For example

[["1", "William"],["2", "Dylan"]]

I'm looking at "for json auto", running a query like this.

declare @t table(id int, name varchar(20))

insert into @t (id, name) values( 1, 'William')
insert into @t (id, name) values( 2, 'Dylan')

declare @result as varchar(max)

select id, name from @t for json auto  

However it includes the names with every value.

[{"id":1,"name":"William"},{"id":2,"name":"Dylan"}]

Is there a way to instruct SQL Server to omit the names and just return a string array?

I'll need to update a couple hundred queries, so I'm hoping for an answer that doesn't require too much modification on a basic query.

  • You're likely going to want string aggregation here, not the JSON functionality. Commented Jan 16, 2022 at 14:56
  • 2
    I don't believe your "optimized" json string is valid json. Commented Jan 16, 2022 at 14:56
  • @AaronBertrand .. most likely ["id", null] Commented Jan 16, 2022 at 17:09
  • RE: Nulls, output null, not "" so if 2 name was empty [["1","William"],["2",null]] Commented Jan 16, 2022 at 17:12

2 Answers


Unfortunately, SQL Server does not support the JSON_AGG function or similar. You can hack it with STRING_AGG and STRING_ESCAPE.

You can either do this with a single aggregation, concatenating each row together:

SELECT '[' + STRING_AGG(CONCAT(
    '["',
    id,
    '","',
    STRING_ESCAPE(name, 'json'),
    '"]'
    ), ',') + ']'
FROM @t t;
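Run against the sample table from the question, this produces the desired shape. One caveat: without an ordering clause, STRING_AGG does not guarantee element order; adding WITHIN GROUP makes it deterministic (a sketch):

```sql
-- Deterministic element order via WITHIN GROUP (ORDER BY id)
SELECT '[' + STRING_AGG(CONCAT(
    '["', id, '","', STRING_ESCAPE(name, 'json'), '"]'
    ), ',') WITHIN GROUP (ORDER BY id) + ']'
FROM @t t;
-- [["1","William"],["2","Dylan"]]
```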

Or with a nested aggregation, first aggregating each row in an unpivoted subquery, then all rows together:

SELECT '[' + STRING_AGG('[' + j.json + ']', ',') + ']'
FROM @t t
CROSS APPLY (
    SELECT STRING_AGG('"' + STRING_ESCAPE(value, 'json') + '"', ',')
    FROM (VALUES
        (CAST(id AS nvarchar(max))),
        (name)
    ) v(value)
) j(json);
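One thing to watch on larger tables: when the input to STRING_AGG is not (n)varchar(max), the aggregation errors once the result passes 8000 bytes. Casting the per-row fragment avoids that; sketched on the single-aggregation version:

```sql
-- Cast the per-row fragment to varchar(max) so STRING_AGG
-- is not capped at 8000 bytes on larger tables
SELECT '[' + STRING_AGG(CAST(CONCAT('["', id, '","',
           STRING_ESCAPE(name, 'json'), '"]') AS varchar(max)), ',') + ']'
FROM @t t;
```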


I've assumed the columns are not nullable. Nullable columns will need special handling; I leave it as an exercise for the reader.
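For that exercise, one option is to map a NULL name to a JSON null via COALESCE (a sketch, assuming you want `["3",null]` rather than an error or a dropped element):

```sql
-- STRING_ESCAPE(NULL, 'json') yields NULL, and '"' + NULL stays NULL,
-- so COALESCE falls through to the literal string 'null'
SELECT '[' + STRING_AGG(CONCAT(
    '["', id, '",',
    COALESCE('"' + STRING_ESCAPE(name, 'json') + '"', 'null'),
    ']'), ',') + ']'
FROM @t t;
```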


1 Comment

pay attention to NULL values( 1, null)

Not all that different from Charlie's but uses CONCAT_WS to remove some of the explicit " characters:

SELECT [json] = '[' 
       + STRING_AGG('["' + CONCAT_WS('", "', id, 
         STRING_ESCAPE(COALESCE(name,''), N'JSON')) 
       + '"]', ',') + ']'
FROM @t;

Output (after adding a 3rd row, values(3, NULL)):

json
[["1", "William"],["2", "Dylan"],["3", ""]]

If you want the literal string null with no quotes:

SELECT [json] = '[' 
       + STRING_AGG('[' 
       + CONCAT_WS(', ', CONCAT('"', id, '"'), 
           COALESCE('"' + STRING_ESCAPE(name, N'JSON') + '"', 'null'))
       + ']', ',') + ']'
FROM @t;

Output:

json
[["1", "William"],["2", "Dylan"],["3", null]]

If you don't want the NULL value to appear in that row's array at all, just remove the COALESCE:

SELECT [json] = '[' 
       + STRING_AGG('["' + CONCAT_WS('", "', id, 
         STRING_ESCAPE(name, N'JSON')) 
       + '"]', ',') + ']'
FROM @t;

Output:

json
[["1", "William"],["2", "Dylan"],["3"]]

If you don't want that row present in the JSON at all, just filter it out:

FROM @t WHERE name IS NOT NULL;
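Spelled out against the previous query, that filter might look like:

```sql
-- Rows with a NULL name are excluded before aggregation
SELECT [json] = '['
       + STRING_AGG('["' + CONCAT_WS('", "', id,
         STRING_ESCAPE(name, N'JSON'))
       + '"]', ',') + ']'
FROM @t WHERE name IS NOT NULL;
```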

If that column doesn't allow NULLs, state it explicitly so we don't have to guess (probably doesn't hurt to confirm id is unique, either):

declare @t table(id int UNIQUE, name varchar(20) NOT NULL);
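Whichever variant you pick, since these queries build JSON by string concatenation rather than with FOR JSON, it's worth a sanity check that the result actually parses. ISJSON (SQL Server 2016+) returns 1 for a valid JSON object or array:

```sql
-- Returns 1: the generated string is a valid JSON array of arrays
SELECT ISJSON('[["1", "William"],["2", "Dylan"],["3", null]]');
```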

12 Comments

Hi Aaron, thanks, this is really great, but how am I going to do this with 100's of queries. I guess they will all have to be re-written, field by field?
@WilliamWalseth I wish I knew magic but if you want output that's different from the easy case (e.g. FOR JSON AUTO), yes? Maybe it's easier to have the consumer / application layer formulate your expected JSON output. I'm curious how you got into a state where you have 100s of queries that don't produce the output you want?
@WilliamWalseth I could envision a way to automate this to some degree, depending on where your queries currently live and how consistent the tables are that the queries go against, but the effort probably wouldn't be worth it (and would require lots of manual validation anyway).
Currently the queries produce XML for 100's of generated maintenance pages. Originally, these pages were created as simple HTML forms and table lists with page refreshes between every update. We're updating these pages to be more responsive (all in-browser rendering), and use JSON.
@WilliamWalseth I have no way to gauge the scope but is it possible to consume the output in XML and translate that to JSON, instead of reinventing all the queries individually? Maybe that should be the question if rewriting the queries isn't practical anyway...
