
I am trying to add timezone support to a Python/SQLAlchemy script. I have studied time zones and use pytz. I understand I should do as much as possible in UTC and only display local times; due to the nature of the application, this is very easy.

Everything works, except that when I insert UTC data, it somehow gets converted to local time (BST) before entering the database, and I am completely lost as to why this happens and how I can avoid it.

My table (postgres) is defined as follows (relevant part only):

fpp=> \d foo;
                                     Table "public.foo"
  Column   |            Type             |                              Modifiers                          
-----------+-----------------------------+-------------------------------------------------------------
 x         | integer                     | 
 y         | integer                     | 
 when_utc  | timestamp without time zone | 

I have debugged sqlalchemy when it does an insert. This is what happens:

2016-07-28 17:16:27,896 INFO sqlalchemy.engine.base.Engine INSERT INTO 
  foo (x, y, "when_utc") VALUES (%(x)s, %(y)s, %(when_utc)s) RETURNING foo.id

2016-07-28 17:16:27,896 INFO sqlalchemy.engine.base.Engine {
  'when_utc': datetime.datetime(2016, 7, 11, 23, 0, tzinfo=<UTC>), 'y': 0, 'x': 0}

So it inserts 2016-07-11 23:00:00 UTC. When I query it from the psql command line, this is what I find:

fpp=> select x,y,when_utc from foo;
 x | y |      when_utc       
---+---+---------------------
 0 | 0 | 2016-07-12 00:00:00
(1 row)

What is going on? I am adamant nothing modifies the field in between. It just seems to add the DST hour to my database entry. Why? How can I avoid this?

R

  • I found a workaround: changing the timezone from GB to UTC in postgresql.conf and recreating the database. It works now, but this can't be the only way to avoid forced conversions. Commented Jul 28, 2016 at 18:08
  • You can change the timezone of a PostgreSQL connection in SQLAlchemy without changing postgresql.conf. See this answer; a rough sketch of that approach follows below. Commented Jul 28, 2016 at 20:07
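
A rough sketch of that approach, assuming the psycopg2 driver (the connection URL is a placeholder): the options value is passed through to libpq and sets the session time zone for every connection made by the engine, so postgresql.conf stays untouched.

from sqlalchemy import create_engine

# Force the session time zone to UTC for every connection from this engine
# instead of editing postgresql.conf. The URL below is a placeholder.
engine = create_engine(
    "postgresql+psycopg2://user:password@localhost/fpp",
    connect_args={"options": "-c timezone=utc"},
)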

2 Answers


The problem is that your column type is timestamp without time zone, when it should instead be timestamp with time zone. In SQLAlchemy this can be achieved by declaring the column with DateTime(timezone=True); unfortunately the default is timezone=False. See the documentation for more information: https://docs.sqlalchemy.org/en/13/core/type_basics.html#sqlalchemy.types.DateTime
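
For reference, a minimal Core sketch of such a declaration, modelled loosely on the table from the question (the id column and column layout are illustrative, not taken verbatim from it):

from sqlalchemy import Column, DateTime, Integer, MetaData, Table

metadata = MetaData()

# timezone=True makes when_utc a TIMESTAMP WITH TIME ZONE on PostgreSQL,
# so aware datetimes are stored as absolute instants instead of being
# cast into the session time zone.
foo = Table(
    "foo",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("x", Integer),
    Column("y", Integer),
    Column("when_utc", DateTime(timezone=True)),
)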


I was struggling with this problem as well. I found the previous answer very helpful if you are using SQLAlchemy Core, but that wasn't my case, as I was using the SQLAlchemy ORM. If you are using the ORM, you might be interested in how I solved it.

I did it by mapping the column like this:

import datetime

from sqlalchemy import DateTime
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class Event(Base):
    __tablename__ = "event"

    # The ORM requires a primary key on every mapped table.
    id: Mapped[int] = mapped_column(primary_key=True)
    # timezone=True makes this TIMESTAMP WITH TIME ZONE on PostgreSQL.
    occurred_on: Mapped[datetime.datetime] = mapped_column(DateTime(timezone=True))
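
A quick usage sketch under the declarations above (the connection URL and values are placeholders): with timezone=True the aware UTC datetime round-trips as the same instant rather than being shifted into the session time zone.

from sqlalchemy import create_engine, select
from sqlalchemy.orm import Session

engine = create_engine("postgresql+psycopg2://user:password@localhost/fpp")  # placeholder URL
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(
        Event(occurred_on=datetime.datetime(2016, 7, 11, 23, 0, tzinfo=datetime.timezone.utc))
    )
    session.commit()

    stored = session.scalars(select(Event)).first()
    # The column is timestamp with time zone, so this prints an aware
    # datetime representing 2016-07-11 23:00 UTC, whatever the session
    # time zone happens to be.
    print(stored.occurred_on)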

