You have a lot of options. The simplest is probably SQLite's ".dump" shell command, which writes out the full schema and data as SQL; you can then make any modifications you need to the dump file and load it into Postgres.
To dump just the row data for table "my_table" as CSV, you can use:
sqlite> .headers on
sqlite> .mode csv
sqlite> .output model_data.csv
sqlite> SELECT * FROM my_table;
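If you'd rather script the export than drive the sqlite shell, the same CSV dump can be done with Python's built-in sqlite3 and csv modules. A sketch, where "app.db" and "my_table" are placeholder names:

```python
import csv
import sqlite3

def dump_table_to_csv(db_path, table, out_path):
    """Export every row of a SQLite table to a CSV file with a header row."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(f"SELECT * FROM {table}")
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            # cur.description holds one entry per column; item [0] is its name.
            writer.writerow(col[0] for col in cur.description)
            writer.writerows(cur)
    finally:
        conn.close()

# e.g. dump_table_to_csv("app.db", "my_table", "model_data.csv")
```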
Then you can bulk-load the CSVs into Postgres with COPY:
COPY table_name(col1, ...)
FROM 'model_data.csv'
DELIMITER ','
CSV HEADER;
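If you want to drive that load from Python rather than the psql shell, psycopg2's copy_expert can stream the CSV through COPY. A sketch, with placeholder table, column, and connection names:

```python
def copy_sql(table, columns):
    """Build the COPY ... FROM STDIN statement for a CSV with a header row."""
    return "COPY %s(%s) FROM STDIN WITH (FORMAT csv, HEADER true)" % (
        table, ", ".join(columns))

if __name__ == "__main__":
    import psycopg2  # assumes psycopg2 is installed and Postgres is running
    conn = psycopg2.connect(dbname="mydb", user="me")  # placeholder credentials
    with conn, conn.cursor() as cur, open("model_data.csv") as f:
        # Stream the file straight into the server-side COPY.
        cur.copy_expert(copy_sql("table_name", ["col1"]), f)
```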
Peewee can dump data in a variety of formats, so that's one option:
import csv

# Assumes User and Tweet are existing peewee models.
models = [User, Tweet]

for model in models:
    outfile = '%s.csv' % model._meta.table_name
    with open(outfile, 'w', newline='') as f:
        writer = csv.writer(f)

        # Iterate over rows as tuples without loading the table into memory.
        row_iter = model.select().tuples().iterator()
        writer.writerows(row_iter)
Another option would be to use a Postgres foreign-data wrapper: connect to the Postgres database, attach the SQLite database via the fdw, and do a straight copy from one to the other in the psql shell.
https://github.com/pgspider/sqlite_fdw
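Roughly, the fdw route looks like this on the Postgres side. A sketch only, assuming sqlite_fdw is installed and that your version supports IMPORT FOREIGN SCHEMA; the path, server, and table names are placeholders:

```sql
CREATE EXTENSION sqlite_fdw;
CREATE SERVER sqlite_srv FOREIGN DATA WRAPPER sqlite_fdw
    OPTIONS (database '/path/to/app.db');
-- expose the SQLite tables as foreign tables
IMPORT FOREIGN SCHEMA public FROM SERVER sqlite_srv INTO public;
-- then a straight copy into a native Postgres table
INSERT INTO local_table SELECT * FROM my_table;
```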
Python's standard library also includes the "sqlite3" module, which can work with SQLite directly. For the Postgres side there is psycopg2. You could use the first to pull rows out of SQLite and feed them to the second for insertion into Postgres. There is also pgloader, which is built for exactly this kind of migration, SQLite->Postgres included.
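A minimal sketch of that sqlite3-to-psycopg2 transfer; the table name, column count, and connection details are all placeholders:

```python
import sqlite3

def insert_sql(table, n_columns):
    """Build a parameterized INSERT statement with one %s per column."""
    params = ", ".join(["%s"] * n_columns)
    return "INSERT INTO %s VALUES (%s)" % (table, params)

if __name__ == "__main__":
    import psycopg2  # assumes psycopg2 is installed and Postgres is running
    src = sqlite3.connect("app.db")
    dst = psycopg2.connect(dbname="mydb", user="me")  # placeholder credentials
    rows = src.execute("SELECT * FROM my_table")  # sqlite3 yields plain tuples
    with dst, dst.cursor() as cur:
        cur.executemany(insert_sql("my_table", 3), rows)  # 3 = column count
```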