
I have a custom postgres type that looks like this:

CREATE TYPE "Sensor".sensor_telemetry AS
(
    sensorid character varying(50),
    measurement character varying(20),
    val numeric(7,3),
    ts character varying(20)
);

I am trying to execute a call to a postgres function that takes an array of this type as a parameter.

I am calling this function with SQLAlchemy as follows:

result = db.session.execute("""select "Sensor"."PersistTelemetryBatch"(:batch)""", batch)

where batch looks like:

{
    "batch" : [
        {
            "sensorID" : "phSensorA.haoshiAnalogPh",
            "measurement" : "ph",
            "value": 8.7,
            "timestamp": "2019-12-06 18:32:36"
        },
        {
            "sensorID" : "phSensorA.haoshiAnalogPh",
            "measurement" : "ph",
            "value": 8.8,
            "timestamp": "2019-12-06 18:39:36"
        }
    ]
}

When running this execution, I am met with this error:

sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) can't adapt type 'dict'

I'm guessing that psycopg2 is complaining because the array entries are dicts, since I can supply dictionaries as parameters to other pg function executions (though those dictionaries are not contained within an array as in this case). Am I correct about this?

How do I go about correctly passing an array of these objects to my pg function?

  • Convert your dicts to tuples: stackoverflow.com/questions/59031648/… Commented Dec 8, 2019 at 5:47
  • using the code in your linked answer (the psycopg/literal response), the list of tuples looks like [('value', 'measurement', 'timestamp', 'sensorID'), ('value', 'measurement', 'timestamp', 'sensorID')]. Should I just be supplying the values of these keys here? Additionally, does the order of the values in each tuple need to be the same as the pgsql type I have created? @IljaEverilä Commented Dec 8, 2019 at 7:38
  • The tuples should contain the values, and they should reflect the order of fields of your user-defined-type. Come to think of it, this might be a nice use case for Python's operator.itemgetter(), which iirc returns tuples of values, if given multiple keys. Commented Dec 8, 2019 at 13:58
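For illustration, here is a minimal sketch (using the sample record from the question) of what `operator.itemgetter` returns when given multiple keys:

```python
from operator import itemgetter

row = {"sensorID": "phSensorA.haoshiAnalogPh", "measurement": "ph",
       "value": 8.7, "timestamp": "2019-12-06 18:32:36"}

# Given several keys, itemgetter returns a callable that extracts a tuple
# of the corresponding values, in the order the keys were given.
ig = itemgetter("sensorID", "measurement", "value", "timestamp")
print(ig(row))  # ('phSensorA.haoshiAnalogPh', 'ph', 8.7, '2019-12-06 18:32:36')
```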

1 Answer


A straightforward way to pass the data is to convert the list of dicts to a list of tuples in Python and let psycopg2 handle adapting those to suitable SQL constructs:

from operator import itemgetter

# The key order must match the field order of "Sensor".sensor_telemetry:
# (sensorid, measurement, val, ts)
ig = itemgetter("sensorID", "measurement", "value", "timestamp")
batch = {"batch": list(map(ig, batch["batch"]))}
# The explicit CAST tells Postgres to treat the anonymous records
# as the composite type.
query = """
        SELECT "Sensor"."PersistTelemetryBatch"(
            CAST(:batch AS "Sensor".sensor_telemetry[]))
        """
result = db.session.execute(query, batch)

Another interesting option when your data is a list of dicts would be to use json_populate_record() or json_populate_recordset(), but for those you'd have to rename the keys to match the fields of the composite type:

import json

# Rename the keys to match the composite type's field names, so that
# json_populate_recordset() can map them to the record's fields.
batch = [{"sensorid": r["sensorID"], 
          "measurement": r["measurement"],
          "val": r["value"],
          "ts": r["timestamp"]}
         for r in batch["batch"]]
batch = {"batch": json.dumps(batch)}

query = """
        SELECT "Sensor"."PersistTelemetryBatch"(ARRAY(
            SELECT json_populate_recordset(
                       NULL::"Sensor".sensor_telemetry,
                       :batch)))
        """
result = db.session.execute(query, batch)
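As a standalone sketch (no database needed, using one of the sample records from the question), the key renaming above produces JSON whose fields line up with sensor_telemetry:

```python
import json

incoming = {"batch": [
    {"sensorID": "phSensorA.haoshiAnalogPh", "measurement": "ph",
     "value": 8.7, "timestamp": "2019-12-06 18:32:36"},
]}

# Rename each dict's keys to the composite type's field names
# (sensorid, measurement, val, ts).
renamed = [{"sensorid": r["sensorID"],
            "measurement": r["measurement"],
            "val": r["value"],
            "ts": r["timestamp"]}
           for r in incoming["batch"]]

# This JSON string is what gets bound to :batch and fed to
# json_populate_recordset() on the Postgres side.
payload = json.dumps(renamed)
print(payload)
```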
