
I have a webserver running NGINX & PHP, with a very basic multi-client test:

    <?php
        if (isset($_GET['query'])) {
            echo "HELLO MY NAME IS WEBSERVER";
        }
        if (isset($_GET['sleep'])) {
            sleep(10);
        }
    ?>

If I run http://servername.com/index.php?query, I get an instant response.

If I request ?sleep and then ?query together, ?query appears to be queued until ?sleep completes.

This happens across multiple clients. Client A can request ?sleep, which will affect Client B's ?query request. Client B is a completely different machine.

Is there any method of tweaking php.ini or my nginx config to allow a separate PHP worker process to spawn (or something along those lines)?

Edit: For a little background, here's my config.

nginx.conf:

    location ~ \.php$ {
            fastcgi_pass   127.0.0.1:9123;
            fastcgi_index  index.php;
            fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
            include        fastcgi_params;
    }

fastcgi_params:

    fastcgi_param  QUERY_STRING       $query_string;
    fastcgi_param  REQUEST_METHOD     $request_method;
    fastcgi_param  CONTENT_TYPE       $content_type;
    fastcgi_param  CONTENT_LENGTH     $content_length;

    fastcgi_param  SCRIPT_NAME        $fastcgi_script_name;
    fastcgi_param  REQUEST_URI        $request_uri;
    fastcgi_param  DOCUMENT_URI       $document_uri;
    fastcgi_param  DOCUMENT_ROOT      $document_root;
    fastcgi_param  SERVER_PROTOCOL    $server_protocol;
    fastcgi_param  REQUEST_SCHEME     $scheme;
    fastcgi_param  HTTPS              $https if_not_empty;

    fastcgi_param  GATEWAY_INTERFACE  CGI/1.1;
    fastcgi_param  SERVER_SOFTWARE    nginx/$nginx_version;

    fastcgi_param  REMOTE_ADDR        $remote_addr;
    fastcgi_param  REMOTE_PORT        $remote_port;
    fastcgi_param  SERVER_ADDR        $server_addr;
    fastcgi_param  SERVER_PORT        $server_port;
    fastcgi_param  SERVER_NAME        $server_name;

    # PHP only, required if PHP was built with --enable-force-cgi-redirect
    fastcgi_param  REDIRECT_STATUS    200;

php execution (runphp.bat):

    set PATH=%cd%\php;%PATH%
    start %cd%\php\php-cgi.exe -b 127.0.0.1:9123

Edit 2: OK, so it appears I need PHP-FPM, which is not available on Windows:

It is important to note that FPM is not built with the windows binaries.  Many of the guides you may find online rely on php-cgi.exe.  Unfortunately they call it FPM but this is incorrect!

The executable php-cgi.exe that is bundled with the windows binaries is a FastCGI interface but it is *not* FPM (Fastcgi Process Manager).  php-cgi.exe does not have multi-threading or concurrent request support, nor support for any of the FPM configuration options.

So, as a workaround, I'm trying the multiple php servers / processes approach:

    upstream php {
        server  127.0.0.1:9000;
        server  127.0.0.1:9001;
        server  127.0.0.1:9002;
        server  127.0.0.1:9003;
    }

    location ~ \.php$ {
        fastcgi_pass   php;
        fastcgi_index  index.php;
        fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
        include        fastcgi_params;
    }

However, NGINX will not start at all with this configuration; it doesn't seem to accept the "upstream php {}" block.
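For what it's worth, nginx only accepts upstream blocks as direct children of the http context, never inside a server block. If the snippet above was pasted inside server { } (an assumption, since the surrounding context isn't shown), that alone would stop nginx from starting. The placement would need to look roughly like this:

```nginx
http {
    # upstream must sit at the http level...
    upstream php {
        server  127.0.0.1:9000;
        server  127.0.0.1:9001;
        server  127.0.0.1:9002;
        server  127.0.0.1:9003;
    }

    server {
        # ...and is then referenced by name inside the server block
        location ~ \.php$ {
            fastcgi_pass   php;
            fastcgi_index  index.php;
            fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
            include        fastcgi_params;
        }
    }
}
```

Running `nginx -t` will report the exact line the parser rejects.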

Any ideas?

Thanks

2 Comments
  • How exactly is PHP integrated into nginx? PHP-FPM? How many FPM workers are configured? Commented Jul 25, 2018 at 8:34
  • How many cores does the server with php-fpm have? How many php-fpm child processes are created? You shouldn't be experiencing this behavior, I can't replicate it on my setup. Commented Jul 25, 2018 at 8:40

3 Answers

6

As per the edits, PHP-FPM isn't available on Windows. However, this can be worked around by spawning multiple php-cgi processes on different ports, and configuring NGINX to load-balance across them.

My "RunPHP.bat" script:

    set PATH=%cd%\php;%PATH%
    runhiddenconsole.exe %cd%\php\php-cgi.exe -b 127.0.0.1:9100
    runhiddenconsole.exe %cd%\php\php-cgi.exe -b 127.0.0.1:9101
    runhiddenconsole.exe %cd%\php\php-cgi.exe -b 127.0.0.1:9102
    runhiddenconsole.exe %cd%\php\php-cgi.exe -b 127.0.0.1:9103

My nginx.conf (php bits only):

    http {

        upstream php_farm {
            server 127.0.0.1:9100 weight=1 max_fails=1 fail_timeout=1s;
            server 127.0.0.1:9101 weight=1 max_fails=1 fail_timeout=1s;
            server 127.0.0.1:9102 weight=1 max_fails=1 fail_timeout=1s;
            server 127.0.0.1:9103 weight=1 max_fails=1 fail_timeout=1s;
        }

        server {
            location ~ \.php$ {
                fastcgi_pass   php_farm;
                fastcgi_index  index.php;
                fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
                include        fastcgi_params;
            }
        }
    }

2 Comments

On Windows you'll run into so many stupid problems, one of them being pipelining where requests get queued even though they should be parallel. Your question didn't contain any reference to Windows. Running a Linux VM with nginx + php-fpm will produce expected results. Using Windows to do any sort of proper development with PHP - won't.
I would give you a thousand votes if I could for this solution. I'm hosting on Linux Android (ARMv7l) and I have a problem with the PHP-CGI module (7.3.3), which is compiled for one instance only, i.e. it doesn't recognize the process manager directives (pm.max_children, etc.). I am now running 4 PHP processes of this module on different ports, while Nginx balances the incoming connections. Moreover, when one of the PHP processes restarts itself, the incoming connection is not rejected, because Nginx (1.10.1) connects to the next valid process. 1000+ votes for your answer.
-1

It looks like you misunderstand how a request flows through HTTP/Nginx/PHP. Let me explain:

  1. HTTP is a stateless protocol. In your case that means there is no way, within a normal request, to send some content to the client, wait some time (sleep), and then send more content.
  2. If some requests block other requests, you need to spawn more PHP-FPM workers so that many simultaneous requests can be handled.
  3. There is a way to send some content to the client, close the connection, and keep running PHP code: the worker signals that the response is complete but does not finish its work. This is how Symfony runs its background tasks after the response is sent to the client.
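Under PHP-FPM, the mechanism in point 3 is exposed as fastcgi_finish_request(). A minimal sketch (note: this function only exists under FPM, so it will not work with the php-cgi.exe setup from the question):

```php
<?php
// Build and send the response first.
echo "HELLO MY NAME IS WEBSERVER";

// Flush the response and close the connection to the client.
// fastcgi_finish_request() is provided by PHP-FPM only.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}

// Everything below runs after the client has already received
// the full response, so this sleep no longer blocks the client.
sleep(10);
```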

For now you need to tweak your config. Pay attention to these two Nginx parameters:

    worker_processes  1;
    worker_connections  1024;

The first sets how many Nginx worker processes run; the second sets the maximum number of simultaneous connections each worker process can handle.

After that, please do some tweaks around the PHP-FPM config. Look at these parameters:

    pm = dynamic             ; allows FPM to adjust the number of workers
    pm.max_children = 5      ; maximum number of worker processes
    pm.start_servers = 3     ; number of workers created on startup
    pm.min_spare_servers = 2 ; minimum number of idle workers kept around
    pm.max_spare_servers = 4 ; maximum number of idle workers kept around
    pm.max_requests = 200    ; requests a worker handles before being respawned

Basically that's all you need. Now you have to experiment with all those params to find the best configuration for your case.

2 Comments

You didn't read the question properly and you completely misunderstood what happens.
I have set worker_processes to 2, and have also copied that "PHP-FPM" config into my php.ini file (I don't believe there is an FPM config file?). The issue still persists: Client A browses to servername.com/index.php?sleep, Client B browses to servername.com/index.php?query, and Client B must still wait for Client A's 10-second sleep to finish. Client B appears to be queued behind Client A.
-1

https://github.com/deemru/php-cgi-spawner

php-cgi-spawner is the smallest and simplest application for spawning multiple php-cgi FastCGI processes on Windows for your web server.

1 Comment

It's a nice alternative suggestion, but the question was about getting the nginx server config working.
