I have two Python scripts on a Raspberry Pi: one runs in a background shell emptying a queue, the other runs in the foreground adding user inputs to the queue. I've built a version where the queue is stored as a SQLite database, and to make this work each script has to connect to the database before each operation and disconnect afterwards, which avoids locking conflicts but slows the process down significantly. The overhead is large enough that I can hammer the inputs fast enough to confuse the script and make it miss/ignore some of them. Is a single SQLite database the fastest method for storing my shared queue, or is there a faster alternative (possibly one that uses RAM instead of writing to disk) that both scripts can access more quickly?
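To be clear about the pattern I mean, here's a rough sketch of it (not my actual code; the table layout, `DB_PATH`, and function names are made up for illustration):

```python
import sqlite3

DB_PATH = "queue.db"  # hypothetical path, not the one in my repo

def push(item):
    # Connect, write, disconnect for every single operation --
    # this per-operation overhead is the bottleneck described above.
    conn = sqlite3.connect(DB_PATH)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS queue (id INTEGER PRIMARY KEY, item TEXT)"
        )
        conn.execute("INSERT INTO queue (item) VALUES (?)", (item,))
        conn.commit()
    finally:
        conn.close()

def pop():
    # Same dance on the consumer side: oldest row out, then disconnect.
    conn = sqlite3.connect(DB_PATH)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS queue (id INTEGER PRIMARY KEY, item TEXT)"
        )
        row = conn.execute(
            "SELECT id, item FROM queue ORDER BY id LIMIT 1"
        ).fetchone()
        if row is None:
            return None
        conn.execute("DELETE FROM queue WHERE id = ?", (row[0],))
        conn.commit()
        return row[1]
    finally:
        conn.close()
```

Each call pays the full connect/open/close cost, which is why rapid input bursts outrun the consumer.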
(Note: I'm open to drastic suggestions like switching language to NodeJS)
I didn't post the code because it's more a question of the appropriate technology for the job, but if you want to see my current repo it's at https://github.com/martinjoiner/bookfetch-scanner-python
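One RAM-based direction I've been wondering about (sticking with Python rather than something drastic) is a small `multiprocessing.managers` queue server that both independently started scripts connect to over localhost, so the queue lives in memory in one process. A minimal sketch; the port and authkey are placeholders:

```python
import queue
import threading
from multiprocessing.managers import BaseManager

# The real queue lives in RAM in whichever process runs start_server().
_shared = queue.Queue()

class ServerManager(BaseManager):
    """Owns the real queue; run in one long-lived process."""

class ClientManager(BaseManager):
    """Used by any other process to reach the server's queue."""

ServerManager.register("get_queue", callable=lambda: _shared)
ClientManager.register("get_queue")

ADDRESS = ("127.0.0.1", 50517)  # placeholder port
AUTHKEY = b"bookfetch"          # placeholder shared secret

def start_server():
    # In real use this would be its own long-running script;
    # a daemon thread keeps the sketch self-contained.
    mgr = ServerManager(address=ADDRESS, authkey=AUTHKEY)
    server = mgr.get_server()
    threading.Thread(target=server.serve_forever, daemon=True).start()

def get_remote_queue():
    # Both the foreground (producer) and background (consumer)
    # scripts would call this once at startup, then reuse the proxy.
    mgr = ClientManager(address=ADDRESS, authkey=AUTHKEY)
    mgr.connect()
    return mgr.get_queue()
```

Each script connects once and keeps the proxy open, so there's no per-operation connect/disconnect cost like with the SQLite version.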