
I have a script running parallel workers: it reads a DB of about 50,000 entries and processes them in batches of 400 each. The script runs fine at first but gradually dies out. By my estimate it should run for about 6 hours, but it dies after an hour or so.

There's no obvious problem with the code itself, since it works fine at first. I unset all the variables that are no longer required, among other cleanup, to improve performance, but after an hour or so it still dies out.

I use 4 parallel connections to the MySQL DB to do this, keeping the total limited to 4 connections.

Why does the script's speed decay over time?

EDIT: DONE! The script was exhausting memory because it kept creating stdClass objects; I switched to curl_multi_exec and now it works like a charm! :)
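For anyone hitting the same thing, the pattern looks roughly like this (the URLs here are placeholders; the real script feeds in one batch of them at a time):

    <?php
    // Minimal curl_multi_exec sketch: run one batch of requests in parallel
    // instead of 50,000 sequential curl_exec() calls. URLs are placeholders.
    $urls = ['http://example.com/a', 'http://example.com/b'];
    $multi = curl_multi_init();
    $handles = [];

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($multi, $ch);
        $handles[] = $ch;
    }

    // Drive all handles until every transfer in the batch completes.
    do {
        $status = curl_multi_exec($multi, $active);
        if ($active) {
            curl_multi_select($multi); // wait for activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    // Collect results and free every handle so nothing leaks between batches.
    foreach ($handles as $ch) {
        $body = curl_multi_getcontent($ch);
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);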

2 Answers


It could be running out of memory. It's hard to tell without seeing the script; depending on your queries, offloading some of the work onto the DB could speed things up. More information is needed, obviously.
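If your queries allow it, one way to keep memory flat is to page through the table with a keyset cursor so each batch stays small. A sketch (the table, column names, and credentials are assumptions, and get_result() needs the mysqlnd driver):

    <?php
    // Keyset-paged batches: memory stays bounded by one batch of 400 rows.
    // 'entries', 'id', 'payload', and the credentials are placeholders.
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');
    $lastId = 0;

    do {
        $stmt = $db->prepare('SELECT id, payload FROM entries WHERE id > ? ORDER BY id LIMIT 400');
        $stmt->bind_param('i', $lastId);
        $stmt->execute();
        $rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);
        $stmt->close();

        foreach ($rows as $row) {
            // process $row here
            $lastId = $row['id'];
        }
    } while (count($rows) === 400); // a short final batch means the table is done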


1 Comment

I increased the memory limit to 256 MB and set the PHP timeout to 0. It just keeps getting slower and slower: it starts with a burst and then gradually slows down. The script is about 8,000 lines long, so I can't post it here.

It can only really be a leak.

Your steps should be:

Create 4 handlers and give each a connection to the DB
While you haven't processed the whole DB:
    For every handler, execute:
        Process 400 records

Now, as long as you're re-using the same connections, and your "process" step doesn't create anything without freeing it afterwards, it should be able to go on forever.
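A minimal sketch of that loop (the credentials, the 'entries' table, and process() are all placeholders for your own code, and the LIMIT/OFFSET query is just one way to walk the table):

    <?php
    // Sketch of the loop above. Connection details, the 'entries' table,
    // and process() stand in for your own code.
    function process(array $row) {
        // per-record work goes here
    }

    $connections = [];
    for ($i = 0; $i < 4; $i++) {
        // Open each connection once and re-use it for every batch.
        $connections[$i] = new mysqli('localhost', 'user', 'pass', 'mydb');
    }

    $offset = 0;
    $total  = 50000;

    while ($offset < $total) {
        foreach ($connections as $db) {
            $result = $db->query("SELECT * FROM entries LIMIT $offset, 400");
            while ($row = $result->fetch_assoc()) {
                process($row);
            }
            $result->free(); // release the result set so memory can't accumulate
            unset($row);     // drop per-batch references before the next pass
            $offset += 400;
        }
    }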

2 Comments

"Create 4 handlers and give each a connection to the DB" - done. "While you haven't processed the whole DB, for every handler, process 400 records" - done. This is what I'm doing right now. I use persistent connections to my MySQL DB and made sure the persistent timeout is more than 50 seconds, which is more than enough. I have four parallel connections. After analyzing the database, I found that at any one time only one of the four threads is processing, and then the next.
I'm using curl multiple times, actually 50,000 times. Can that be an issue?
