
Hi guys!
My application is a bot. It simply receives a message, processes it, and returns a result.
But there are a lot of messages, and I'm creating a separate thread to process each one, which slows the application down considerably.
So, is there any way to reduce CPU usage by replacing threads with something else?

  • @david-heffernan It looks that way. Threads have to synchronize their data, which costs CPU, and creating and destroying threads costs CPU as well. Commented Feb 20, 2011 at 23:52

4 Answers


You probably want processes rather than threads. Spawn processes at startup, and use Pipes to talk to them.

http://docs.python.org/dev/library/multiprocessing.html
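
A minimal sketch of that setup, assuming a single worker process and using str.upper as a stand-in for the bot's real message processing:

from multiprocessing import Process, Pipe

def worker(conn):
    # Receive messages over the pipe, process them, and send results back.
    while True:
        msg = conn.recv()
        if msg is None:            # sentinel: shut the worker down
            break
        conn.send(msg.upper())     # stand-in for the real processing

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=worker, args=(child_conn,))
    p.start()

    parent_conn.send('hello')
    print(parent_conn.recv())      # -> 'HELLO'

    parent_conn.send(None)         # tell the worker to exit
    p.join()

In a real bot you would spawn several such workers at startup and keep their pipe ends around for the lifetime of the application.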


1 Comment

I'd still recommend using queues instead of pipes most of the time - way easier in my opinion (except for die-hard UNIX programmers obviously ;) )
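
For comparison, a rough sketch of the queue-based variant the comment suggests; handle() is a hypothetical stand-in for the bot's processing:

from multiprocessing import Process, Queue

def handle(msg):
    return msg.upper()             # hypothetical message processing

def worker(q):
    while True:
        msg = q.get()
        if msg is None:            # sentinel: stop the worker
            break
        print(handle(msg))

if __name__ == '__main__':
    q = Queue()
    p = Process(target=worker, args=(q,))
    p.start()
    q.put('hello')
    q.put(None)
    p.join()

A single queue can be shared by many worker processes, which is what makes it more convenient than keeping a pair of pipe ends per process.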

Threads and processes have the same speed. Your problem is not which one you use, but how many you use.

The answer is to have only a fixed number of threads or processes, say 10. You then create a queue (using the Queue module) to store all the messages from your bot. The 10 threads work constantly, and every time one finishes an item it waits for the next message in the queue.

This saves you from the overhead of creating and destroying threads. See http://docs.python.org/library/queue.html for more info.

from queue import Queue
from threading import Thread

def worker():
    while True:
        item = q.get()
        do_work(item)          # your message-processing function
        q.task_done()

num_worker_threads = 10        # a fixed pool, as described above
q = Queue()
for i in range(num_worker_threads):
    t = Thread(target=worker)
    t.daemon = True
    t.start()

for item in source():          # source() yields the bot's incoming messages
    q.put(item)

q.join()       # block until all tasks are done

2 Comments

Considering that threads in CPython don't run in parallel (look up the GIL), processes via the multiprocessing API are the better approach - and since the process API mostly follows the thread API, it shouldn't be too hard to change the code (a rough sketch follows below). Using a Pool is still a good idea (processes have those too).
@Voo: Threads in Python are not for utilizing multiple processors; they are simply for making things run more smoothly. But considering Vladimir's reply to darkporter, you are right that processes are the way to go.
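
To illustrate how closely the process API follows the thread API, here is a rough translation of the worker-pool example above to multiprocessing, using JoinableQueue (which provides task_done/join); do_work() and source() are the same placeholders as in the threaded version:

from multiprocessing import Process, JoinableQueue

def worker(q):
    while True:
        item = q.get()
        do_work(item)              # same placeholder as in the threaded version
        q.task_done()

if __name__ == '__main__':
    q = JoinableQueue()
    for i in range(10):            # a fixed pool of 10 worker processes
        p = Process(target=worker, args=(q,))
        p.daemon = True
        p.start()

    for item in source():          # source() yields incoming messages
        q.put(item)

    q.join()                       # block until all tasks are done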

You could try creating only a limited number of workers and distributing the work between them. Python's multiprocessing.Pool would be the thing to use.
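
A minimal sketch with Pool, assuming handle_message() stands in for the bot's real processing:

from multiprocessing import Pool

def handle_message(msg):
    return msg.upper()             # hypothetical message processing

if __name__ == '__main__':
    messages = ['hi', 'how are you', 'bye']    # hypothetical incoming messages
    with Pool(processes=4) as pool:
        results = pool.map(handle_message, messages)
    print(results)

For a long-running bot, pool.apply_async() lets you hand each incoming message to the pool as it arrives instead of mapping over a pre-built list.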



You might not even need threads. If your server can handle each request quickly, you can just make it all single-threaded using something like Twisted.
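
For reference, a bare-bones single-threaded Twisted server in that style (an echo protocol standing in for the bot's real logic):

from twisted.internet import protocol, reactor

class Bot(protocol.Protocol):
    def dataReceived(self, data):
        # All requests are handled in one thread by the reactor's event loop.
        self.transport.write(data)      # echo back; stand-in for real processing

class BotFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return Bot()

if __name__ == '__main__':
    reactor.listenTCP(8000, BotFactory())   # port 8000 is arbitrary
    reactor.run()

This only stays responsive as long as each dataReceived call returns quickly, which is exactly the caveat raised in the comment below.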

1 Comment

I'm currently using Twisted, and the processing isn't fast enough to leave it all in a single thread.
