
I'm writing a REST service in Python (Django), and this service needs to consume another REST service through its API.

Here is the code, with the approximate time each line takes:

connection = statServer("myname", "mypassword")

q1 = connection.getJSONdict("query1") # approximately 15 seconds 
q2 = connection.getJSONdict("query2") # approximately 20 seconds
q3 = connection.getJSONdict("query3") # approximately 15 seconds

# my processing approximately 0.01 of second
# merge q1 + q2 + q3

It's clear to me that each getJSONdict("query") call does nothing but wait on I/O, so it doesn't consume processor time.

The requests run sequentially, so I could run them on separate threads. I know that Python supposedly doesn't provide real parallel threading (because of the GIL), but in my case the threads would just be waiting on I/O, so threading should still help.

I think this is a really common use case in Python, so if you have dealt with a task like this, please help me solve mine.


I have thought about a Fork/Join-style framework, but a ThreadPoolExecutor would probably be better for consuming the requests (and reusing threads) across all requests in my REST service.
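A sketch of what I mean with `concurrent.futures.ThreadPoolExecutor` (my `statServer` connection isn't shown here; `get_json_dict` is a stand-in that simulates the I/O wait with a short sleep):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def get_json_dict(query):
    # Stand-in for connection.getJSONdict; the real calls take 15-20 seconds
    time.sleep(0.1)
    return {query: "result"}

# Run all three queries concurrently; map preserves input order
with ThreadPoolExecutor(max_workers=3) as executor:
    results = list(executor.map(get_json_dict, ["query1", "query2", "query3"]))

# Merge q1 + q2 + q3
merged = {}
for d in results:
    merged.update(d)
```

With three threads the total wall time should be roughly the longest single request (~20 seconds) instead of the ~50-second sum.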


1 Answer


I managed to do it myself.

from multiprocessing.pool import Pool, ThreadPool
# ... other imports

# You can decide here whether to use processes or threads:
# if you want threads, change Pool() to ThreadPool()
pool = Pool()
connection = statServer("myname", "mypassword")

res = pool.map(connection.getJSONdict, ["query1", "query2", "query3"])
print(res)
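For I/O-bound calls like these, `ThreadPool` is usually the better choice: threads release the GIL while waiting on the network, and they avoid the pickling and process-startup overhead of `Pool`. A minimal runnable variant using a context manager so the pool is cleaned up automatically (the real `connection.getJSONdict` is replaced by a stub here):

```python
from multiprocessing.pool import ThreadPool

def get_json_dict(query):
    # Stub for connection.getJSONdict, which performs a slow network call
    return {"query": query}

# The with-block joins and closes the pool's threads on exit;
# map blocks until all results are ready and preserves input order
with ThreadPool(processes=3) as pool:
    res = pool.map(get_json_dict, ["query1", "query2", "query3"])

print(res)
```

For reuse across many requests in the service, you would create the pool once at module level rather than per request.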