
I have a PHP script that comes down to the following:

while ($arr = mysql_fetch_array($result))
{
   $url = $arr['url'];
   exec("curl $url > /dev/null &");
} 

Each $url points to a remote script.

My question is: what can I expect if I try to cycle through 2,000 URLs?

Will opening that many curl connections cripple my server? Could I fire all 2,000 in less than one minute?

What I am trying to do is prevent my users from having to set up cron jobs, by opening the connections and running their remote scripts for them.

Can you guys advise? I'm out of my league today.

Hudson


2 Answers


Take a look at curl_multi_init. It does not fire off separate processes, so it should be gentler on your server.

I would advise you to fire only 3-15 at a time, depending on the load the server can handle and the complexity of the scripts. Firing 2,000 at once will probably make you run out of file descriptors or hit some other limit.
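
For illustration, here is a minimal sketch of that batching idea with curl_multi; the URL list, batch size, and timeout are placeholders, not values from the question:

$urls = array('http://example.com/job1.php', 'http://example.com/job2.php'); // placeholder URLs
$batchSize = 10; // tune to what your server can handle

foreach (array_chunk($urls, $batchSize) as $batch)
{
   $mh = curl_multi_init();
   $handles = array();

   foreach ($batch as $url)
   {
      $ch = curl_init($url);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture output instead of printing it
      curl_setopt($ch, CURLOPT_TIMEOUT, 300);         // allow long-running remote scripts
      curl_multi_add_handle($mh, $ch);
      $handles[] = $ch;
   }

   // Drive every handle in this batch to completion before starting the next batch.
   do
   {
      $status = curl_multi_exec($mh, $active);
      if ($active)
      {
         curl_multi_select($mh); // wait for activity instead of busy-looping
      }
   } while ($active && $status === CURLM_OK);

   foreach ($handles as $ch)
   {
      curl_multi_remove_handle($mh, $ch);
      curl_close($ch);
   }
   curl_multi_close($mh);
}

Because each batch finishes before the next one starts, the number of simultaneous connections (and open file descriptors) stays bounded at the batch size.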


Comments

It may actually be cleaner and just as quick to use exec("curl $url > /dev/null &"); because > /dev/null & discards the output and backgrounds the process, there's no need to wait for one curl to complete before moving on to the next. I am not certain.
@atw You can configure curl to send a HEAD request instead, and then the remote servers will send only headers, which you can disregard anyway (sending the output to /dev/null doesn't make it magically faster). See the sketch after these comments.
Say the remote script takes three minutes to complete (a data compiler); would the returned HEAD act as a read receipt and still guarantee that the connection remains open for the entire three minutes?
@atw With another plus: with PHP, if you only make requests in batches of, say, 5, you can easily tell when those 5 have ended and then fire another 5. That's a bit harder to do with background shell jobs. Of course, the best option (without going to async I/O) would be to use something like Java's thread pool so that you could always have 5 requests in flight (as soon as one finished, another would be started).
@atw Yes, it will remain open. The only difference is that the HTTP response will have no body (the echoes etc. in the PHP script will have no effect).
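
A rough sketch of the HEAD-request approach mentioned in the comments above (the URL and timeout are placeholders); with CURLOPT_NOBODY set, curl issues a HEAD request, and the call still blocks until the remote script finishes:

$ch = curl_init('http://example.com/job.php');  // placeholder URL
curl_setopt($ch, CURLOPT_NOBODY, true);         // send HEAD instead of GET, so no response body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the (empty) response
curl_setopt($ch, CURLOPT_TIMEOUT, 300);         // allow the remote script up to 5 minutes
curl_exec($ch);
curl_close($ch);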

You need to keep an eye on the number of open files (connections) so you don't hit the file-max limit on your server.
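
One rough way to check that limit from PHP (ulimit -n reports the per-process limit; the 2048 threshold here is only an illustrative assumption):

$limit = (int) trim(shell_exec('ulimit -n')); // ulimit is a shell builtin, so run it via the shell
if ($limit < 2048)
{
   echo "Only $limit file descriptors are available per process\n";
}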

