Why would you want a web request to run for 10 minutes? PHP is not meant for that at all. I've seen backoffice processes, for example generating files to be downloaded, take that long or even longer to complete, and users can wait for them, but the process won't be reliable: it can die for many reasons (hardware limitations, config limitations, network failure, etc.), and on shared hosting the server will explicitly kill your process no matter what the configuration is.
On a dedicated server or VPS you can, of course, modify the directives and allow your script to run forever, but the question is not how to do it, but whether that is the approach you need. Maybe it is; I'm not saying it's an absolutely forbidden approach, but probably it isn't. Alternatives could be:
- Have a cron job run your sh script on a regular basis. When a web request comes in, you serve the precalculated data.
- If you don't want the data regenerated too often, you can have the cron job run frequently but first check (against a database or a file) whether it actually has anything to do. When a request comes in, your PHP changes that database field or file, so the cron job knows it has to run.
- Have PHP launch the sh script asynchronously, so the PHP execution ends immediately, the user is informed that the data is being processed, and the actual process runs in the background.
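The second option above (a frequent cron job gated by a request flag) can be sketched with a flag file; the path and flag name here are hypothetical, and the actual analysis command is left commented out. Your PHP page would create the flag (e.g. with `touch()` or `file_put_contents()`), and cron would run this script every few minutes:

```shell
#!/bin/sh
# Hypothetical crontab entry: */5 * * * * /Users/Data/myproject/run_if_requested.sh

FLAG=/tmp/da_run_requested        # flag file written by the PHP side

# --- PHP side (conceptually): a request creates the flag ---
touch "$FLAG"

# --- cron side: do the heavy work only if a run was requested ---
if [ -f "$FLAG" ]; then
    rm -f "$FLAG"                 # consume the request so we run only once
    echo "flag found, running analysis"
    # sh /Users/Data/myproject/run_virt_da_py.sh
else
    echo "no request pending, exiting"
fi
```

Removing the flag before the long job starts means a request arriving mid-run simply queues the next run instead of being lost.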
This is a really common approach; you can see it on many websites. For example, if you request some heavy export from your hosting panel, it is generated in the background and you are notified when it is ready. Facebook does the same when you request a backup of your data. Business intelligence systems, and systems that analyze data, also tend to have background processes that generate precalculated data so the user-facing processes can stay lightweight.
In enterprise-level applications, what I usually do with this kind of heavy process is provide a link for accessing the last generated data (showing its date), a process that runs frequently, and a link to force a new run right now.
With this you get a more reliable system, and you also minimize the chance of two users analyzing the same data on different computers at the same time, which would mean two 10-minute processes producing the same data.
UPDATE
If you don't really need to let the user see the result of the process in real time, then it's even easier. I have to say that I like crons, since they help a lot to keep your processes in order and under control as the system grows. But if you don't want that at this moment, you can try to launch a background process directly from PHP. Try this code (I used it many years ago, but I guess it will still work). It is good for both Unix and Windows servers.
/**
 * Execute $cmd in the background without PHP waiting for it to finish.
 */
function execInBackground($cmd) {
    if (substr(php_uname(), 0, 7) == "Windows") {
        pclose(popen("start /B " . $cmd, "r"));
    } else {
        exec($cmd . " > /dev/null &");
    }
}
So, to start your command, you would do:
<html lang="en">
<head>
    <title>Data_analysis</title>
</head>
<body>
<?php
    execInBackground('sh /Users/Data/myproject/run_virt_da_py.sh');
    echo "data analysis script has been started";
?>
</body>
</html>
Of course, since it runs as a background process, you don't directly know when it has completed, so you have to include some additional actions at the end of the sh script, such as launching another PHP script with whatever notification process you need: sending an email, updating a database, or the like.
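As a minimal sketch of that last step (the log path and the `notify_done.php` script are hypothetical names, not part of your project), the tail end of `run_virt_da_py.sh` could look like this:

```shell
#!/bin/sh
# ... the long analysis runs above ...

# Record completion so the web side (or you) can check when data was last built:
echo "analysis finished at $(date)" >> /tmp/da_run.log

# Hand off to a small PHP notifier that e.g. sends an email or flips a
# database flag so the web side knows the data is ready:
# php /Users/Data/myproject/notify_done.php
echo "notification step reached"
```

Keeping the notification in a separate PHP script lets you reuse your existing mail or database code instead of duplicating it in shell.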