I currently have a website that has twice been suspended by my hosting provider for "overusing system resources". In each case, there were 300 - 400 crashed copies of one of my PHP scripts left running on the server.
The scripts themselves pull an image from a web camera at home and copy it to the server. They make use of file locks to ensure only one can write at a time. The scripts are called every 3 seconds by any client viewing the page.
Initially I was confused, as I had understood that a PHP script either completes (returning the result), or crashes (returning the internal server error page). I am, however, informed that "defunct scripts" are a very common occurrence.
Would anyone be able to educate me? I have Googled this to death but I cannot see how a script can end up in a crashed state. Would it not time out when it reaches the max execution time?
My hosting provider is using PHP set up as CGI on a Linux platform. I believe I have actually identified the problem with my script: I did not realise that flock() is a blocking function (and I am not using the LOCK_NB mask). I am assuming that somehow hundreds of copies of my script end up blocked waiting for a resource to become available, and this leads to a crash? Does this sound plausible? I am reluctant to re-enable the site for fear of it being suspended again.
Any insights greatly appreciated.
It sounds like you are polling with setInterval() when what you should be doing instead is calling setTimeout() in the success handler for the previous ajax call. That way a slow request delays the next poll instead of piling new requests on top of it.

If you are flock()ing a file, make sure you remember to LOCK_UN it as soon as you are finished with it, and also remember that if you are only reading it you don't need LOCK_EX; LOCK_SH will suffice. flock() blocks, and when you're running on *nix the script timeout won't occur if the script never gets to the top of the lock queue. It's pretty silly that this is the case, but apparently it is. There is also a note on the set_time_limit() manual page to that effect.
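To illustrate the setTimeout()-in-the-handler idea: here is a minimal sketch of the pattern on the client side. The helper name pollAfterCompletion and the example URL are my own placeholders, not anything from the original post.

```javascript
// Schedule each poll from the completion handler of the previous one, so a
// slow or hung request delays the next poll instead of stacking new requests
// on top of it (which is what a fixed setInterval() timer would do).
function pollAfterCompletion(task, delayMs, maxRuns, onDone) {
  let runs = 0;
  function step() {
    task().then(() => {
      runs += 1;
      if (runs >= maxRuns) {
        onDone(runs);
        return;
      }
      // The next run starts delayMs *after* this one finished.
      setTimeout(step, delayMs);
    });
  }
  step();
}

// In the webcam page this would wrap the ajax call, e.g.:
// pollAfterCompletion(() => fetch('/webcam.php').then(updateImage), 3000, Infinity, () => {});
```

With this shape, the server only ever sees one outstanding request per client, however long each request takes.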