I need to generate a JSON file for a DataTables section on my web site. The JSON format should look like this:

{
        "data": [
                [
                        "Tiger Nixon",
                        "System Architect",
                        "Edinburgh",
                        "5421",
                        "2011/04/25",
                        "$320,800"
                ],
                [
                        "Garrett Winters",
                        "Accountant",
                        "Tokyo",
                        "8422",
                        "2011/07/25",
                        "$170,750"
                ]
        ]
}

This format looks like the array of arrays listed in the Postgres docs under JSON Creation Functions (Table 9-42), e.g. array_to_json.

However, my data is not stored as arrays in my database. It's typical row data.

But when I try to export to JSON using "row_to_json" -- the format is way off from what I need. See this SQLfiddle example.

Basically, I'm trying to figure out the best approach to make the JSON export look more like the format pasted above.

Should I be reformatting my rows into arrays first, and storing them in a new table, so when I use "row_to_json" they come out as arrays? Or is there a better way?

2 Answers

Select rows as arrays and use json_agg(array_to_json(...)):

select 
    json_build_object(
        'data', 
        json_agg(array_to_json(array[price::text, qty::text])))
from data;

                 json_build_object                  
----------------------------------------------------
 {"data" : [["11","11"], ["22","11"], ["33","11"]]}
(1 row)

json_build_object was introduced in Postgres 9.4, which SQLFiddle does not support, so I can't link a working fiddle.
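For comparison, the same array-of-arrays shape can be assembled client-side after a plain SELECT. A minimal Python sketch, assuming sample rows like the ones the query above aggregates (in practice they would come from a database driver such as psycopg2):

```python
import json

# Sample rows as a driver would return them: list of (price, qty) tuples.
# These values are assumed for illustration.
rows = [(11, 11), (22, 11), (33, 11)]

# Mirror json_agg(array_to_json(array[price::text, qty::text])):
# cast every column to text and collect the rows into one outer array.
payload = {"data": [[str(col) for col in row] for row in rows]}

print(json.dumps(payload))
```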

2 Comments

json_build_object is from 9.4
SQLFiddle is annoying anyway
You can use spyql for that:

$ psql -U my_user -h my_host -c "SELECT * FROM my_table" --csv my_db | spyql -Oindent=4 "SELECT list_agg(_values) AS data FROM csv TO json"
{
    "data": [
        [
            "1",
            "David",
            "5.1"
        ],
        [
            "2",
            "James",
            "2.1"
        ],
        [
            "3",
            "Joana",
            ""
        ],
        [
            "4",
            "Hellen",
            "9.8"
        ]
    ]
}

We are using psql to export a query/table to CSV and then spyql to convert the CSV to JSON. The _values variable in spyql holds a list with the value of each column before automatic type casting, so all values are strings. If you prefer the casted values, use the cols variable instead. list_agg aggregates all rows into a single array. Here's what you would get if you used cols instead of _values (notice also the handling of NULL values):

$ psql -U my_user -h my_host -c "SELECT * FROM my_table" --csv my_db | spyql -Oindent=4 "SELECT list_agg(cols) AS data FROM csv TO json"
{
    "data": [
        [
            1,
            "David",
            5.1
        ],
        [
            2,
            "James",
            2.1
        ],
        [
            3,
            "Joana",
            null
        ],
        [
            4,
            "Hellen",
            9.8
        ]
    ]
}
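If you'd rather avoid an extra tool, the _values-style aggregation (every field kept as a string, NULLs becoming empty strings) can be sketched in plain Python with the standard csv module; the CSV text below is assumed to match what psql --csv emits for the sample table:

```python
import csv
import io
import json

# Assumed psql --csv output for the sample table (header row included).
csv_text = "id,name,score\n1,David,5.1\n2,James,2.1\n3,Joana,\n4,Hellen,9.8\n"

reader = csv.reader(io.StringIO(csv_text))
next(reader)  # skip the header row

# Like list_agg(_values): keep every field as a string; NULL stays "".
payload = {"data": [row for row in reader]}

print(json.dumps(payload, indent=4))
```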

For the record, I created the sample table with the following SQL command:

CREATE TABLE my_table AS
SELECT *
FROM (VALUES (1, 'David', 5.1), (2, 'James', 2.1), (3, 'Joana', NULL), (4, 'Hellen', 9.8)) 
  AS t(id, name, score);

Disclaimer: I am the author of spyql
