
I have written a PHP script to import a large amount of data. The import process is triggered by an Ajax call, and the Ajax request keeps waiting for the server's response. Since we are working on a dedicated server, there is no timeout issue.

The problem is that we need a way to terminate the import process, for example a stop button on the client side. We assumed that if we aborted the waiting Ajax call, the process on the server would also stop, since there would no longer be a request to serve. Unfortunately that is not the case: the script keeps executing on the server even after the Ajax request has been aborted on the client.

Secondly, we use PHP sessions in this project. Suppose the cancel button makes an Ajax call to another script on the server to stop the process: how would that request ever reach the server while the first Ajax request is still pending? PHP/Apache will hold the second request until the first request cycle has completed.

Note: our project architecture requires session_start() on every page. It would be good if anyone could guide us on these issues.

  • You didn't ask a question. Commented May 13, 2013 at 12:16
  • @hobbs read the last line "It will be good if anyone can guide on these issues" Commented May 13, 2013 at 13:44
  • Second point: PHP locks the session file while it is actively used by a script. Unless you call session_write_close() to release the lock, only one session-using PHP script can be in flight at any one time. Commented May 13, 2013 at 14:02

6 Answers

1

You can write the process ID to a file, the DB, or a cache at the start of the script. You can get it with getmypid() (http://php.net/manual/en/function.getmypid.php). This assumes each script runs in its own process.

The kill script (which must not use the locked session) could then read that process ID and try to kill the process.
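
A minimal sketch of this approach, shown as the two scripts side by side. The PID-file path is illustrative, and posix_kill() requires the posix extension:

```php
<?php
// Sketch of the PID-file approach. importer.php records its own PID before
// starting the long import; kill.php (which must not touch the locked
// session) reads it back and signals the process. The path is illustrative.

// --- importer.php ---
$pidFile = '/tmp/import.pid';                // illustrative path
file_put_contents($pidFile, getmypid());
// ... long-running import work here ...

// --- kill.php ---
$pid = (int) file_get_contents($pidFile);
if ($pid > 0 && $pid !== getmypid()) {       // guard: never signal ourselves
    posix_kill($pid, 15);                    // 15 = SIGTERM
    unlink($pidFile);
}
```

Note that the web-server user running kill.php must have permission to signal the importer process, which is normally the case when both run as the same user.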


4 Comments

Are you sure that, when a request reaches the server, it creates only one process?
@asim-ishaq I don't understand your question. But as I said, this assumes each script has its own process ID. If I'm not mistaken, you can also configure your web server to handle all requests in the same process, but I don't think that is the most common setup. In my experience the web server runs (forks) each request in its own process.
Actually I am testing another idea, which still isn't working. The idea: the user cancels the Ajax request (done). The PHP script on the server keeps checking whether the connection has been aborted and, if so, terminates the process. I am using the PHP function connection_aborted() to determine the connection status; it should return 1 if the connection is terminated, otherwise 0. The problem is that it always returns 0 whether or not the request has been aborted. Do you have any idea why? See php.net/manual/en/function.connection-aborted.php
@asim-ishaq That's a good idea. See this question/answer for more background on the connection status.
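
The behavior described in these comments has a known cause: PHP only notices a dropped connection when it actually tries to send output to the client. A hedged sketch of the usual workaround, where importNextChunk() is a hypothetical stand-in for the real import work:

```php
<?php
// Sketch: PHP detects a client abort only while writing output, so the
// import loop must periodically push a byte to the client and flush.
ignore_user_abort(true);            // don't let PHP kill us mid-chunk; we
                                    // detect the abort and clean up ourselves

function importNextChunk(): bool {  // hypothetical stand-in for real work
    static $chunks = 3;
    return $chunks-- > 0;
}

while (importNextChunk()) {
    echo ' ';                       // send a byte so PHP probes the connection
    @ob_flush();                    // flush PHP's output buffer, if one is active
    flush();                        // flush the SAPI / web-server buffer

    if (connection_aborted()) {     // now reflects the real connection state
        // ... roll back or mark the import as cancelled ...
        exit;
    }
}
```

Output buffering at the web-server level (e.g. mod_deflate or nginx proxy buffering) can still swallow the probe bytes, so this detection is best-effort.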
1

Be careful with long-running processes in PHP, as PHP's Zend engine and garbage collector are not well suited to them.

So I strongly suggest using a proper job manager such as Gearman, which has a PHP extension. A job manager gives you full control over each process: you can start and stop processes and tasks.

Another option is to use a message queue, such as AMQP, to handle these tasks more cleanly. I'll let you decide which one suits your use case better.

Comments

0

How about setting a time limit slightly higher than your Ajax timeout in your PHP script? This should kill the PHP script if it runs over time. Something like this:

<?php

set_time_limit(20);

$i = 1;
while ($i <= 10) {
    echo "i=$i ";
    sleep(100); // note: on Unix, time spent in sleep() does not count toward the limit
    $i++;
}

?>

Source: http://php.net/manual/en/function.set-time-limit.php

1 Comment

There should be no timeout at all for the PHP script; that's why we are using a VPS. We just want to give the end user a way to terminate a process he has initiated on the server.
0

Once a script is running, it can only be stopped by ending the PHP process executing it. One possibility is to use the session to store a "continue" condition that another script can update.

For example:

Script 1 is the worker (importer).
Script 2 is called repeatedly via Ajax for as long as the importer should keep working.
Scripts 1 and 2 share, let's say, $_SESSION['lastPing'].

So:

Script 2 sets $_SESSION['lastPing'] = time(); on each call.
Script 1 periodically checks if (time() - $_SESSION['lastPing'] > 30) { die(); }
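
A sketch of the worker side of this idea, with one detail the outline above glosses over: the worker must release the session lock with session_write_close(), otherwise the Ajax heartbeat (Script 2) can never run. The chunk loop is a stand-in for the real import work:

```php
<?php
// Sketch of the worker (Script 1) for the heartbeat approach, assuming
// Script 2 refreshes $_SESSION['lastPing'] on every Ajax call.

session_start();
$_SESSION['lastPing'] = time();          // the import was just started
session_write_close();                   // release the lock for other requests

foreach (range(1, 5) as $chunk) {        // stand-in for the real import loop
    // ... import one chunk of data here ...

    session_start();                     // briefly reacquire the lock to read
    $lastPing = $_SESSION['lastPing'];
    session_write_close();

    if (time() - $lastPing > 30) {       // no heartbeat for 30 s: client cancelled
        die('Import cancelled');
    }
}
```

Reopening the session per chunk adds a little overhead, but it keeps the lock held only for the brief read, so the heartbeat calls are never blocked for long.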

1 Comment

As the application uses PHP sessions and the first request is still being processed, the second request cannot be handled until the first cycle completes. So if we make another Ajax call, it will wait until the earlier request finishes. This is a limitation of scripts that use PHP sessions.
0

You might be able to handle this using the proc_* functions. A possible outline of the steps:

  1. Create a unique token and pass it to the server along with the order to begin the import.
  2. When the order to import is received (via Ajax):
    a) Store a record showing that this token is being processed (in a file, DB, or memcached).
    b) Run the import script using proc_open().
    c) Begin a polling loop, checking the record from (2a) to see whether the token is still in "processing" status. If not, call proc_terminate() and exit the loop.
  3. If the order to stop the import is given (in a separate Ajax call), update the persisted record to mark the token as stopped, which will be picked up in (2c).
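
Steps 2b–2c could be sketched as below. The token and flag-file path are illustrative (step 2a/3 could equally use a DB or memcached), and a short sleep stands in for the real import command:

```php
<?php
// Hedged sketch of steps 2b-2c: run the import in a child process and poll
// a cancel flag, terminating the child when the flag appears.
$token      = 'import-job-1234';             // token from step 1 (illustrative)
$cancelFlag = "/tmp/cancel-$token";          // step 3 would create this file

$cmd  = 'php -r "sleep(1);"';                // stand-in for the real import script
$proc = proc_open($cmd, [1 => ['pipe', 'w'], 2 => ['pipe', 'w']], $pipes);

while (true) {
    $status = proc_get_status($proc);
    if (!$status['running']) {               // import finished on its own
        break;
    }
    if (file_exists($cancelFlag)) {          // stop order arrived (step 3)
        proc_terminate($proc);               // sends SIGTERM to the child
        break;
    }
    usleep(250000);                          // poll every 250 ms
}
foreach ($pipes as $pipe) {
    fclose($pipe);
}
proc_close($proc);
```

Keep in mind that the polling request itself holds the session lock unless it calls session_write_close(), as noted in the question's comments.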

Comments

0

Go to the following link for an exact solution to your problem:

PHP auto-kill a script if the HTTP request is cancelled/closed

Comments
