I want to build a SQL query to pass into the spark-redshift reader's "query" option. I'm trying to use psycopg2, so I do something like this:
from datetime import datetime
from psycopg2 import sql

query = sql.SQL(
    "select * from {} where event_timestamp < {}"
).format(
    sql.Identifier("events"),
    sql.Literal(datetime.now())
).as_string()
But as_string() raises an error telling me I need to pass a context (a connection or cursor). I can't, because I don't have any connection.
Should I fall back to plain string formatting with some manual escaping in this case?
Or is there any way to pass some mock context there? And why does it need a connection just to build the query string? Does the generated SQL actually change depending on the connection?
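For reference, the plain-string fallback I have in mind looks like the sketch below. The `quote_ident`/`quote_literal` helpers are my own, not from psycopg2, and I'm only assuming this quoting (double quotes for identifiers, single quotes for literals, each escaped by doubling) is sufficient for Redshift:

```python
from datetime import datetime

def quote_ident(name: str) -> str:
    # Double-quote the identifier, escaping embedded double quotes
    # by doubling them (standard SQL identifier quoting).
    return '"' + name.replace('"', '""') + '"'

def quote_literal(value: str) -> str:
    # Single-quote a string literal, escaping embedded single quotes
    # by doubling them.
    return "'" + value.replace("'", "''") + "'"

# Render the timestamp as an ISO-8601 string (fixed here so the
# output is reproducible; in practice it would be datetime.now()).
ts = datetime(2023, 1, 1, 12, 0, 0)
query = "select * from {} where event_timestamp < {}".format(
    quote_ident("events"),
    quote_literal(ts.isoformat(sep=" ")),
)
print(query)
# select * from "events" where event_timestamp < '2023-01-01 12:00:00'
```

But hand-rolling escaping like this is exactly what I was hoping psycopg2's sql module would save me from.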