
I am running a Python script that uses the Django ORM to handle calls to the database.

I have the following code:

from concurrent.futures import ThreadPoolExecutor

from django.db import transaction

@transaction.atomic
def process(counter):
    MyModel.objects.filter(user_id=counter).delete()
    users = ...  # placeholder: build the list of MyModel instances to insert
    MyModel.objects.bulk_create(users)

with ThreadPoolExecutor(max_workers=2) as executor:
    executor.submit(process, 2)
    executor.submit(process, 1)

I am using Django's default behavior, which sets autocommit to True.

When I debug the code, it looks like Django uses the same connection across the application, and therefore the last thread to exit the process method commits all of the transactions that are using that connection.

How can I prevent this from happening? I would like Django to close (and commit) the connection at the end of each method so that each thread is handled separately.
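To show the shape of what I am after, here is a rough sketch (the wrapper name is just for illustration) of each thread closing its own connection once its work is done:

from django.db import connection

def process_and_close(counter):
    try:
        process(counter)
    finally:
        # django.db.connection is a thread-local proxy, so this
        # closes only the connection used by the current thread
        connection.close()

with ThreadPoolExecutor(max_workers=2) as executor:
    executor.submit(process_and_close, 2)
    executor.submit(process_and_close, 1)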

  • Do you need these users in the code afterwards? Or could you use Celery? Commented Dec 3, 2017 at 21:01
  • I don't need the objects in the code afterwards. Commented Dec 3, 2017 at 21:03
  • As a rule of thumb, whenever you want to do something concurrently inside Django, you should use the Celery project. Commented Dec 3, 2017 at 22:00

1 Answer


I suggest using Celery. You will need a message broker such as Redis or RabbitMQ to send messages between Django and Celery.
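For context, here is a minimal sketch of the Celery app that the @app.task decorator below assumes (the project name and broker URL are placeholders; adjust them to your setup):

# celery.py -- minimal Celery application bound to a Redis broker
from celery import Celery

app = Celery('myproject', broker='redis://localhost:6379/0')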

@app.task
def process(counter):
    MyModel.objects.filter(user_id=counter).delete()
    users = ...  # placeholder: build the list of MyModel instances to insert
    MyModel.objects.bulk_create(users)

After that, call the task with apply_async:

process.apply_async((2,))
process.apply_async((1,))
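For a call with no extra options, the delay shorthand is equivalent:

process.delay(2)
process.delay(1)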

Setting this up is a bit more complicated, but it works well.
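Note that you also need a worker process running to consume the tasks, started with something like the following (assuming the Celery app lives in a module named myproject):

celery -A myproject worker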
