
I have just inherited a site with a PHP script that is consistently running out of memory at 117 MB. This happens even when I increase PHP's memory_limit variable to 312 MB, which I'm doing via php.ini.

This is now solved thanks to a great clue from pcguru. See my answer below, which begins: "I have finally found the answer".

ini_get('memory_limit') returns the value set in php.ini, so I'm certain Apache has restarted after changing the value. I'm using memory_get_usage(true) to return the memory consumed by the script at various points along the way. And it's consistently failing when it gets to 117 MB.
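The checkpoint logging described above can be sketched roughly like this (the function name and labels are illustrative, not taken from the actual script):

```php
<?php
// Rough sketch of checkpoint logging with memory_get_usage(true).
// logMemory and the checkpoint labels are illustrative only.
function logMemory($label)
{
    $mb = memory_get_usage(true) / 1048576;   // bytes actually reserved from the OS
    error_log(sprintf('%s: %.1f MB (memory_limit=%s)',
        $label, $mb, ini_get('memory_limit')));
}

logMemory('before query');
// ... run the query, build the arrays ...
logMemory('after array merge');
```

With true passed, memory_get_usage reports the memory PHP has reserved from the system rather than just what the script's variables occupy, which is the more relevant figure when chasing a hard allocation failure.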

Is there some internal PHP limit I'm unaware of that prevents it from ever allocating more than 117 MB to an individual script?

The server has 1GB of RAM and is running CentOS. I have root shell access. PHP is version 5.3.18. MySQL is version 5.1.66-cll.

This script is behind a username/password and I can't provide public access to it.

Edited to Add:

1) Thanks all for your help to date. You'll find more info in my replies to specific user comments under various answers below.

2) Suhosin is definitely not installed. I've checked in multiple places, including running a script that checks for its constants and running php -v

3) The apache log has no record of the specific error message I'm getting. Logging is switched on in php.ini. I piped through grep to search the entire log.

4) Is it possible the wrong error is being reported in this case?
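For point 2, the in-script side of that check can be sketched like this (both calls print false on a stock PHP build without Suhosin):

```php
<?php
// Two quick in-script checks for Suhosin, as described in point 2.
var_dump(extension_loaded('suhosin'));   // false when the extension isn't loaded
var_dump(defined('SUHOSIN_PATCH'));      // false when the patch isn't compiled in
```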

  • What exactly is the error that you see? Perhaps you're confused by the "tried to allocate xxx bytes" part. Commented Dec 19, 2012 at 15:42
  • Kudos for having an up-to-date PHP 5.3 version. Commented Dec 19, 2012 at 15:42
  • Do you have the suhosin patch by any chance? Commented Dec 19, 2012 at 15:43
  • forums.cpanel.net/f5/php-memory-limit-problem-69715.html Commented Dec 19, 2012 at 15:45
  • I know this is off-topic, but what are you doing with PHP that needs to allocate 300 MB of RAM? That's quite a lot more than your typical PHP script. There are plenty of perfectly good reasons for needing that kind of RAM, so I'm not trying to criticise, but I would be interested to know more, as I've seen a lot of cases where people have written PHP programs to load vast amounts of data into memory when it was more efficient (and usually quicker too) to load only a bit of the data at a time. Commented Dec 19, 2012 at 15:51

5 Answers


I have finally found the answer. The clue came from pcguru's answer beginning 'Since the server has only 1 GB of RAM...'.

On a hunch I looked to see whether Apache had memory limits of its own as those were likely to affect PHP's ability to allocate memory. Right at the top of httpd.conf I found this statement: RLimitMEM 204535125

This is put there by WHM/cPanel. According to the following webpage, WHM/cPanel incorrectly calculates this value on a virtual server: http://forums.jaguarpc.com/vps-dedicated/17341-apache-memory-limit-rlimitmem.html

The script that runs out of memory gets most of the way through, so I increased RLimitMEM to 268435456 (256 MB) and reran the script. It completed its array merge and produced the csv file for download.
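A quick sanity check of the values involved: RLimitMEM takes a plain byte count, and 256 MB works out to exactly the number used above (for comparison, the cPanel-generated 204535125 is roughly 195 MB):

```php
<?php
// RLimitMEM takes a byte count; 256 MB expressed in bytes:
echo 256 * 1024 * 1024, "\n";                       // 268435456
// The cPanel-generated value, converted back to MB:
printf("%.1f MB\n", 204535125 / 1048576);            // ~195.1 MB
```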

ETA: After further reading about RLimitMEM and RLimitCPU I decided to remove them from httpd.conf. This allows ini_set('memory_limit','###M') to work, and I now give that particular script the extra memory it needs. I also doubled the RAM on that server.

Thank you to everyone for your help in detecting this rather thorny issue, and especially to pcguru who came up with the vital clue that got me to the solution.


1 Comment

This put me on the right path to resolve my issue, so a big thank you for sharing your info! In WHM, these settings can be adjusted under "Apache Configuration > Memory Usage Restrictions". Documentation here: docs.cpanel.net/whm/service-configuration/…

Since the server has only 1 GB of RAM I'm leaning towards the possibility that you have actually run out of system memory entirely.

See this thread. You get the same "PHP Fatal error: Out of memory" instead of the more common "Fatal error: Allowed memory size of ... bytes exhausted". Your error indicates the system cannot allocate more memory at all, meaning even PHP's internal functions cannot allocate more memory, let alone your own code.

How is PHP configured to run with Apache? As a module or as CGI? How many PHP processes can you have running at the same time? Do you have swap space available?

If you use PHP as a module in Apache, Apache has a nasty habit of holding on to memory that the PHP process allocated; my guess is that since it can't restart the PHP module inside a worker, the only way to release that memory is to restart the worker entirely. Each worker that has served PHP therefore grows toward the PHP memory limit over time as it serves scripts that allocate a lot of RAM. So if you have many workers running at the same time, each using 100 MB+, you will quickly run out of RAM. Try limiting the number of simultaneous workers in Apache.
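The limits described above map onto the prefork MPM directives in httpd.conf. The values below are purely illustrative for a 1 GB box, not a recommendation (and note that MaxClients was later renamed MaxRequestWorkers in Apache 2.4):

```apache
# Illustrative prefork limits for a small mod_php server; tune to your workload.
<IfModule prefork.c>
    MaxClients          10     # cap the number of simultaneous workers
    MaxRequestsPerChild 500    # recycle workers so memory held by mod_php is returned
</IfModule>
```

Recycling workers with MaxRequestsPerChild trades some performance (fork overhead) for bounded memory growth, which matches the trade-off mentioned in the comments below.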

4 Comments

Thanks pcguru - this makes sense and fits the facts. To answer your questions: PHP is run as a module. There is swap space available and being used - it's a virtual server running 8 sites. There are some largish DB tables - biggest has 2.4 million records. There are plenty of site visitors, and this report is one of dozens their staff could be running. Your explanation also answers another puzzle, which is why did the report succeed in the wee small hours the night before (when I thought I'd fixed the problem :-). I'm about to watch it run with top in another window.
Running top shows the system doesn't run out of physical RAM while the query is running. Free memory dropped to around 60MB just before crashing, but there was still 1.6GB of swap space. This was a great suggestion and it may have led me to the actual answer, so thanks for that. I'm just testing now.
Good to hear it, there are ways to reduce memory usage with Apache/PHP but it generally comes with lower performance as a result. I hope you get it working. Otherwise one can always buy more RAM for the server. :)
Heh - funny you should say that. The client has just agreed to double the ram and add another cpu. It's a virtual so stop the instance, click a few buttons, and restart. I love virtuals! Hopefully Cpanel/WHM won't come along and clobber my edit to httpd.conf.

This may not be the answer to your problem, but if you run PHP from the command line you can override the memory limit set in php.ini:

php -d memory_limit=321M my_script.php

I'm not exactly sure what the default memory limit via cli is.

Also, you can run php --ini and check the results.

3 Comments

php --ini returns: Configuration File (php.ini) Path: /usr/local/lib; Loaded Configuration File: /usr/local/lib/php.ini; Scan for additional .ini files in: (none); Additional .ini files parsed: (none)
Also - I can't run the script from the prompt because the part that crashes requires several post variables. Unfortunately it's quite complex, using massive arrays for reasons that make no sense to me. In my view it should be using temporary tables and just sending the final result to PHP. But in the meantime, if I can give it 198MB it will complete which is a useful temporary fix.
Understand that. Really strange behaviour.

This isn't an answer to why your script is dying after a certain memory usage, but you can get around it by removing the memory limit entirely within the PHP script itself:

ini_set('memory_limit', '-1');

This is dangerous. If you have a runaway script, PHP will take memory until your server has none left to allocate and falls over. So you should only use this if you're sure the script itself isn't a problem, and only to test the output.
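A middle ground between -1 and a fixed low global limit is to raise the limit only around the heavy section and then restore it. A sketch (the 256M figure is illustrative, not from the original script):

```php
<?php
// Raise memory_limit only for the memory-hungry section, then put it back.
$old = ini_get('memory_limit');
ini_set('memory_limit', '256M');   // illustrative value

// ... memory-hungry work here (big query, array merge, CSV export) ...

ini_set('memory_limit', $old);     // restore the original limit
```

Note that restoring a lower limit only succeeds if current usage has already dropped back below it, so this works best when the heavy data is freed before the restore.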

As to whether PHP has some per-script limit on memory usage: no. I have personally run scripts with near 1GB memory usage.

2 Comments

Thanks M_user. I can't run the risk of giving PHP carte blanche. The client's sites are accessed internationally so there isn't convenient downtime. Useful to know PHP isn't imposing a limit - thanks for that. Is it possible the error is being misreported?
That's not good advice. I'd say optimize your app to use less memory instead of allowing unlimited memory use. If a script is dying before the memory limit is reached, it wouldn't matter how much you increase it; the script will still die.

You have an infinite loop somewhere in your code.

3 Comments

How can you say that without looking at the code. It's a best guess that shouldn't be posted as an answer.
If raising the memory limit does not help, this is the second most common thing that can cause this type of error. Anyone can reproduce it with an infinite loop or a badly written recursive function.
It's not an infinite loop. That would require an infinite array! It's consistently crashing at 117MB despite (supposedly) having more RAM available. What's more, the loop in which the crash occurs is doing an array merge. Each iteration nibbles around 79 more bytes.
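The per-iteration growth mentioned in the last comment can be measured with a sketch like this (memory_get_usage(false) tracks PHP's own allocations, which shows small deltas more precisely than the OS-level figure; the loop body is illustrative, not the original script's):

```php
<?php
// Measure average memory growth per iteration of an accumulating loop.
$before = memory_get_usage(false);

$rows = array();
for ($i = 0; $i < 1000; $i++) {
    $rows[] = array('id' => $i, 'value' => str_repeat('x', 8));
}

$after = memory_get_usage(false);
printf("~%d bytes per iteration\n", ($after - $before) / 1000);
```

Incidentally, if the original script calls array_merge inside the loop, each call copies the whole accumulated array, so the loop is quadratic in both time and peak memory; appending with $rows[] (or merging once after the loop) avoids that.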
