I've been using Postgres to store JSON objects as strings, and now I want to utilize PG's built-in json and jsonb types to store the objects more efficiently.
Basically, I want to parse the stringified JSON and put it in a json column from within PG, without having to resort to reading all the values into Python and parsing them there.
Ideally, my migration should look like this:
UPDATE table_name SET json_column=parse_json(string_column);
I looked at Postgres's JSON functions, and there doesn't seem to be a method of doing this, even though it seems pretty trivial. For the record, my JSON objects are just one-dimensional arrays of strings.
Is there any way to do this?
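
The closest thing I've found is a plain cast, though I'm not sure whether it actually parses the string or just wraps it; a minimal sketch using the same placeholder names as above:

UPDATE table_name SET json_column = string_column::jsonb;

or, if the old column should be converted in place rather than filling a second one:

ALTER TABLE table_name ALTER COLUMN string_column TYPE jsonb USING string_column::jsonb;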
json and jsonb don't store the objects more efficiently. If all you do is store and retrieve JSON values, you might as well stick with text. The JSON types are interesting if you want the database to do something with the data, like process them (jsonb) or check their integrity (json).

jsonb does store the objects more efficiently, since it stores them in a binary representation instead of a string representation. For big data sets this can be a critical gain in storage space, which also affects index size and more. Secondly, storing JSON as text is as stupid as storing integers as text. Even if there is no practical need for it, data clarity should always be enforced, and as you said, having the database check the integrity is never a bad idea.

json could hardly be a loss. But I am curious whether you actually tested that jsonb needs less space than text. text is stored compressed by default if it exceeds a certain length (TOAST), and INSERT and SELECT will take slightly longer because the value has to be parsed or unparsed.

The question does say the goal is to use the json and jsonb types to store the objects more efficiently, but it's no use having a theoretical discussion about this here.
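
For anyone who wants to measure rather than speculate, pg_column_size can give a rough per-row comparison of the two representations (the cast side is computed in memory and measured uncompressed, so the numbers are only indicative); a sketch, again assuming the question's placeholder names:

SELECT pg_column_size(string_column) AS text_bytes,
       pg_column_size(string_column::jsonb) AS jsonb_bytes
FROM table_name
LIMIT 10;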