I'm trying to insert a NumPy array into PostgreSQL. I tried it like this:
import psycopg2
from config import config  # assumed: the helper that reads connection params, as in the standard psycopg2 tutorial setup

def write_to_db(some_arr, some_txt):
    """Insert a new array into the test_db table."""
    sql = """INSERT INTO test_db VALUES(%s, %s);"""
    conn = None
    try:
        params = config()
        conn = psycopg2.connect(**params)
        cur = conn.cursor()
        cur.execute(sql, (some_arr, some_txt))
        conn.commit()
        cur.close()
    except (Exception, psycopg2.DatabaseError) as e:
        print(e)
    finally:
        if conn is not None:
            conn.close()
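For context, this is roughly how I call it (the names are illustrative; the real array comes from a face-encoding model):

import numpy as np

encodings = np.random.rand(125)  # stand-in for my real 125-item float64 array
write_to_db(encodings, 'https://example.com/face_001.jpg')  # hypothetical link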
Before that I had created a table in my DB:
create table test_db (encodings double precision[], link text);
Finally I got an error: "can't adapt type 'numpy.ndarray'"
I need to write a NumPy array of 125 float64 items plus a short text (a link) in each row. There will be a few million rows in my project, so only read speed and database size matter to me. As I understand it, a NumPy array can't be inserted directly and has to be converted to another format first. My first idea was to convert it to binary data and store that, but I don't know how to do the conversion, nor how to get it back out of the DB as a NumPy array.
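This is the kind of round-trip I have in mind, assuming I recreate the column as bytea instead of double precision[] (I'm not sure this is the right approach):

import numpy as np
import psycopg2

# Assumes the table is recreated as:
#   create table test_db (encodings bytea, link text);

def adapt_array(arr):
    # Serialize the float64 array to raw bytes for a bytea column
    return psycopg2.Binary(arr.astype(np.float64).tobytes())

def restore_array(buf):
    # psycopg2 returns bytea as a buffer/memoryview; rebuild the array from it
    # (np.frombuffer gives a read-only view; copy() if it needs to be writable)
    return np.frombuffer(buf, dtype=np.float64)

So an INSERT would pass adapt_array(some_arr) instead of some_arr, and a SELECT would run restore_array on the fetched column. Is that a sensible way to do it, or is there a faster/smaller option for millions of rows?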