
I have a web-app running on a paid VPS. No problem there. I'm moving this app to my own dedicated server.

  • Current Cloud Server (CS): CentOS 6 x86_64; 2 GB RAM; 2 vCPU

  • Virtual machine on Dedicated Server: CentOS 7 x86_64; 2 GB RAM; 2 vCPU

I provisioned the new VM with the same specs, reasoning that "if it works okay with that, it should work with the same."

On one API endpoint, the current CS returns the correct JSON. The new server returns:

Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 4294967296 bytes) in /var/www/api/Components/Database.php on line 439

line 439 is:

call_user_func_array(array($stmt, 'bind_result'), $parameters);
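
For context, that line is presumably the tail of the usual dynamic-binding pattern for mysqli prepared statements. The surrounding Database.php code isn't shown in the question, so this is only a hypothetical sketch of what typically sits around it:

```php
<?php
// Hypothetical reconstruction of the dynamic bind pattern around line 439;
// the real Database.php code is not shown in the question.
$stmt->execute();
$meta       = $stmt->result_metadata();
$row        = [];
$parameters = [];

while ($field = $meta->fetch_field()) {
    // bind_result() needs references, hence binding through $row
    $parameters[] = &$row[$field->name];
}

// The failing line: with libmysql, mysqli sizes each bind buffer to the
// column's declared maximum length -- 4 GB for LONGTEXT -- before any
// data is fetched, which is where the huge allocation comes from.
call_user_func_array(array($stmt, 'bind_result'), $parameters);

$results = [];
while ($stmt->fetch()) {
    $copy = [];
    foreach ($row as $key => $value) {
        $copy[$key] = $value; // dereference the bound variables
    }
    $results[] = $copy;
}
```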

The search results I found here and there were not helpful. Some said to upgrade the PHP version, and 90% of them said to set a larger memory limit. **I did.** I set it to 256M, 512M, 2GB (beyond this there is no RAM available), 4GB and 5GB. **Nada.**

This query works fine on the other (and production) server.

New server:

Server: Apache/2.4.6 (CentOS) OpenSSL/1.0.1e-fips mod_fcgid/2.3.9 PHP/5.4.16 mod_wsgi/3.4 Python/2.7.5
X-Powered-By: PHP/5.4.16

CS:

 Server: Apache 
 X-Powered-By: PHP/5.5.22

I looked at the LENGTH of the base64 data being queried. It is the data of 2 images being sent. This size is returned by MySQL:

select LENGTH(image_base64) from pmv where pmv.model = 1;

That is the query. It returns 2 rows. image_base64 is LONGTEXT. There are some other columns, but they don't add to the issue.

LENGTH(image_base64)
162678
131402

That is clearly nowhere near 4 GB.
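
Worth noting: 4294967296 bytes is exactly 2^32, i.e. the declared maximum length of a LONGTEXT column, not anything derived from the stored data. A quick sanity check:

```php
<?php
// The allocation mysqli attempted equals LONGTEXT's maximum length (2^32),
// while the actual data is under 300 KB.
var_dump(4294967296 === pow(2, 32)); // bool(true)
var_dump(162678 + 131402);           // int(294080) -- the two LENGTH() values
```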

I can't access the PHP/Apache config on the CS. The only thing I haven't tried yet is upgrading PHP from 5.4 to 5.5. Could that be it? I'll try to get access to the server over the weekend to try out any other ideas.

Edit #1

I updated the PHP version to 5.6.9.

Same error:

Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 4294967296 bytes) in /var/www/api/Components/Database.php on line 439

Edit #2

Changing the column type from LONGTEXT to MEDIUMTEXT seems to work, as in this question.

But why on earth do I need to change the column type on this server? As far as I can test right now, it doesn't matter how much data is stored in that column: it gives the error as long as it's a LONGTEXT column.
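
For reference, the workaround amounts to a column change along these lines (table and column names taken from the query above; MEDIUMTEXT tops out at ~16 MB, so the ~160 KB base64 values still fit comfortably):

```sql
-- MEDIUMTEXT has a 16 MB maximum, so the per-column bind buffer stays small;
-- LONGTEXT's 4 GB maximum is what mysqli tried to allocate up front.
ALTER TABLE pmv MODIFY image_base64 MEDIUMTEXT;
```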

Thanks!

  • From the look of the failing filename and the bind_result, it certainly appears to be doing a database query. I would suggest that the size of the data being returned is the base cause of the issue, making this a SQL issue, not PHP. Do you need to return so much data at once? Commented Jun 12, 2015 at 22:23
  • @AlisterBulman MySQL returns a base64 of an image to be recreated on the client. The real size of the image is ~300K, but PHP is complaining about allocating 4 GB. Commented Jun 12, 2015 at 22:25
  • Just one? What is the SQL? Commented Jun 12, 2015 at 22:26
  • It sounds like a software bug handling blob column type. I would upgrade the php to see if it goes away. Commented Jun 12, 2015 at 22:30
  • Are you in a class's method? Or in the global space? Commented Jun 12, 2015 at 22:31

2 Answers


As explained in this bug report:

This is a known limitation of ext/mysqli when using libmysql (always in 5.2 and previous, and when libmysql is enabled with 5.3). The reason is that the server sends not-too-specific metadata about the column. This LONGTEXT has a max length of 4G, and ext/mysqli tries to bind with the max length, to be sure no data loss occurs (the data wouldn't fit in the bind buffer at the C level).

ext/mysqli when using mysqlnd is free from this bug.

So to fix that, you have 4 solutions:

  • You can use a TEXT or MEDIUMTEXT instead of a LONGBLOB or LONGTEXT to use less memory
  • You can try to enable mysqlnd for your current PHP version
  • You can upgrade your PHP, as it should have mysqlnd enabled by default
  • You can use PDO connector instead of mysqli
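
A quick way to check which client library your mysqli build actually uses, sketched below; on a mysqlnd build, the reported client info contains the string "mysqlnd":

```php
<?php
// Reports the mysqli client library version string. On a mysqlnd build this
// prints something like "mysqlnd 5.0.11-dev ..."; on a libmysql build it
// prints the libmysql version, which is the configuration hit by the 4 GB bind.
echo mysqli_get_client_info(), PHP_EOL;
```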

Personally, I would go for the third one, since having a recent version generally brings you many more advantages, as long as it doesn't oblige you to refactor significant parts of your code.

And an obvious warning:

People, changing the memory_limit with ini_set('memory_limit', '-1'); is NOT a solution at all.

Please don't do that. Obviously PHP has a memory leak somewhere, and you are telling the server to just use all the memory it wants. The problem has not been fixed at all.


10 Comments

Put in perspective, it's over 4 GB. And yes, you really should fix it at that amount.
It does. But the same call to the API on both servers at the same time gives different results. It's the same code and database on both ends.
Also, is this bug linked to your table types? stackoverflow.com/questions/18121619/…
I'll try that. However, it does not explain why it already works on the other setup.
No, sorry, I deleted it because that was false: bugs.php.net/bug.php?id=51386. It had not been patched at least as of PHP 5.2.13, but maybe later (via the mysqli extension).

As presented in this SO answer, you could try the following:

ini_set('memory_limit', -1);

You should, however, attempt to find where the memory is going; it is always better to fix than to forget!

This is even more relevant in this case, seeing that you have a usage of over 4 GB; that is one hell of a memory leak.

6 Comments

Tried it a sec ago; it gives the same error. And the machine only has 2 GB of RAM it can allocate anyway. But the query size is not even close to 4 GB.
There's no leak on the production server, and the data being sent is 2 images. IMO it's a bug, and the error message makes no sense.
Very, very strange. I'll take a look into it when I can, I'll get back to you ASAP.
Thanks for looking into it. I'll post more info as I get it, but for now I can't access the server until tomorrow.
