When my application tries to decode a large JSON string (~15K rows, returned by cURL), it fails with:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 91 bytes)
I know I can raise the memory limit or remove it entirely, but I'd rather avoid that. I have been wondering whether there is a different approach to this kind of issue - such as splitting the JSON string into smaller chunks (similar to array_chunk).
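One way the chunking idea could work, sketched below under the assumption that the payload is a flat top-level JSON array of objects (`[{...},{...},...]`): scan the raw string for balanced braces, `json_decode` one element at a time, and hand each decoded row to a callback so only one row's decoded form lives in memory at once. The raw ~12 MB string still has to be held, but the decoded PHP structures (which are typically several times larger than the raw text) never exist all at once. `decode_rows` and the depth-scanning logic are my own illustration, not a built-in API.

```php
<?php
// Sketch, assuming a top-level JSON array of objects. Tracks brace depth
// (skipping braces inside string literals) and decodes each element
// individually instead of materializing the whole structure.
function decode_rows(string $json, callable $handle): void {
    $len = strlen($json);
    $depth = 0;
    $start = -1;          // offset where the current top-level object began
    $inString = false;
    $escaped = false;

    for ($i = 0; $i < $len; $i++) {
        $c = $json[$i];
        if ($inString) {
            if ($escaped)        { $escaped = false; }
            elseif ($c === '\\') { $escaped = true; }
            elseif ($c === '"')  { $inString = false; }
            continue;
        }
        if ($c === '"') { $inString = true; continue; }
        if ($c === '{') {
            if ($depth === 1) { $start = $i; }  // object directly inside the array
            $depth++;
        } elseif ($c === '}') {
            $depth--;
            if ($depth === 1 && $start >= 0) {
                // Decode just this one row, process it, then let it be freed.
                $row = json_decode(substr($json, $start, $i - $start + 1), true);
                $handle($row);
                $start = -1;
            }
        } elseif ($c === '[') { $depth++; }
        elseif ($c === ']')   { $depth--; }
    }
}
```

Usage would look like `decode_rows(file_get_contents('/var/tmp/test.txt'), function ($row) { /* aggregate or store $row */ });`. A dedicated streaming JSON parser library is the more robust version of the same idea.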
UPDATE
Just to make sure the issue is not caused by some other function or loop in the app, I've extracted the JSON string into a file and tried to decode it directly from the file (file size = 11.8 MB). It still fails.
$y = json_decode( file_get_contents('/var/tmp/test.txt') );
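To see how much memory the decode itself costs (and to compare runs on different machines), the test above could be wrapped in a small diagnostic. This is just a measurement sketch; `decode_report` is my own helper name, and `json_last_error_msg()` needs PHP 5.5+.

```php
<?php
// Diagnostic sketch: decode a raw JSON string and report how much memory
// the decode consumed, plus the script's peak usage.
function decode_report(string $raw) {
    $before = memory_get_usage(true);
    $y = json_decode($raw);
    if ($y === null && json_last_error() !== JSON_ERROR_NONE) {
        echo 'decode failed: ' . json_last_error_msg() . PHP_EOL;
    }
    printf(
        "raw: %.1f MB, decode grew memory by %.1f MB, peak: %.1f MB\n",
        strlen($raw) / 1048576,
        (memory_get_usage(true) - $before) / 1048576,
        memory_get_peak_usage(true) / 1048576
    );
    return $y;
}

// With the dump from the question:
// $y = decode_report(file_get_contents('/var/tmp/test.txt'));
```

Running this on both environments would show whether the decoded structure is genuinely larger on one of them or whether something else is eating the 128M first.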
UPDATE 2
The script runs on a Mac OS X environment. I've also tested it on an Ubuntu environment (also with a 128M memory limit) - and there it works perfectly. Should I be concerned?