
I need help encoding a large array into JSON using PHP's json_encode(). The array comes from a local database, and I use JS to parse the JSON. However, the PHP script that converts the array to JSON stops if the dataset is too big (e.g. 100,000 results). I tried setting my memory_limit to -1 and it still does not help. Is it possible to encode a big array using PHP's json_encode()?

Please let me know if my question is too confusing.

Thank you all!

  • The problem with json_encode in PHP is that it tends to add double quotes and escape sequences, which increase the actual size of the JSON being produced. Why can't you handle this on the client side? Commented Aug 23, 2016 at 18:09
  • @DavidR JSON format is needed because it will be used by others (theoretically). Commented Aug 23, 2016 at 18:45
  • @PaulCrovella Just a blank page. The page just stopped. Commented Aug 23, 2016 at 18:46
  • Some code samples may help. How are you passing the JSON between PHP and JS? Commented Aug 23, 2016 at 21:07
  • Does it really make sense to return such large results? Wouldn't you be better of using pagination? The client browser probably won't like the size of the dataset either. Commented Aug 23, 2016 at 23:36
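A blank page like the one described in the comments usually means a fatal error with error display turned off, or json_encode() silently returning false. A small diagnostic sketch (the helper name encodeOrExplain and the setup are my own, for illustration):

```php
<?php
// Diagnostic sketch: report why json_encode() failed instead of silently
// producing a blank page. Enabling display_errors is assumed to be
// acceptable in development only.
ini_set('display_errors', '1');
error_reporting(E_ALL);

function encodeOrExplain(array $data): string
{
    $json = json_encode($data);
    if ($json === false) {
        // json_last_error_msg() names failures such as malformed UTF-8
        // coming from the database.
        return 'json_encode failed: ' . json_last_error_msg();
    }
    return $json;
}
```

Malformed UTF-8 in database rows is a common cause of json_encode() returning false even when memory is not the problem.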

2 Answers


If each row in your array is successfully encoded by json_encode, then take advantage of that fact by encoding each row individually and echoing out the array structure yourself.

That is

$prefix = '';
echo '[';
foreach($rows as $row) {
  echo $prefix, json_encode($row);
  $prefix = ',';
}
echo ']';

If the nesting is more complex, a more involved version of the technique is needed, but this one saved me when I encountered the problem.
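Wrapped in a function, the same technique can also flush output periodically so PHP never holds the full JSON string in memory; a sketch (the function name echoJsonRows is my own, and $rows could equally be an unbuffered database result iterated row by row):

```php
<?php
// Sketch: stream rows as JSON without ever building the full string.
// Only one row is encoded at a time, so peak memory stays per-row.
function echoJsonRows(iterable $rows): void
{
    $prefix = '';
    echo '[';
    foreach ($rows as $row) {
        echo $prefix, json_encode($row);
        $prefix = ',';
        // For very large datasets, flushing keeps the output buffer small:
        // if (ob_get_level() > 0) { ob_flush(); }
        // flush();
    }
    echo ']';
}
```

The flush calls are commented out because whether they help depends on the server's buffering setup.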


1 Comment

Fancier way: $largeJson = '[' . implode(',', array_map('json_encode', $rows)) . ']'; :)

The accepted answer is the one posted above:

$prefix = '';
echo '[';
foreach($rows as $row) {
  echo $prefix, json_encode($row);
  $prefix = ',';
}
echo ']';

However, it is still not guaranteed to work if the dataset is too big.

One possible solution is to limit the query: instead of using SELECT *, select only the columns and rows you need. But in my case I have no control over the query, so this solution is not going to work for me.
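For readers who do control the query, the limiting idea can be combined with the streaming technique above by fetching one page at a time. A sketch, assuming a PDO connection and a hypothetical results table (function name, table, and column names are my own):

```php
<?php
// Sketch: emit JSON page by page so neither the database result nor the
// JSON string for the whole dataset is ever held in memory at once.
function echoJsonPaged(PDO $pdo, int $pageSize = 1000): void
{
    $offset = 0;
    $prefix = '';
    echo '[';
    do {
        $stmt = $pdo->prepare(
            'SELECT id, name FROM results ORDER BY id LIMIT :lim OFFSET :off'
        );
        $stmt->bindValue(':lim', $pageSize, PDO::PARAM_INT);
        $stmt->bindValue(':off', $offset, PDO::PARAM_INT);
        $stmt->execute();
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
        foreach ($rows as $row) {
            echo $prefix, json_encode($row);
            $prefix = ',';
        }
        $offset += $pageSize;
    } while (count($rows) === $pageSize); // a short page means we are done
    echo ']';
}
```

LIMIT/OFFSET pagination slows down at large offsets on some databases; keyset pagination (WHERE id > :last ORDER BY id) is an alternative worth considering for very large tables.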

The last option is to upgrade our server's memory, or the server itself, to keep up with the continuously growing dataset.

Thanks all for your help!
