
I have a big JSON file containing an array of objects. The file is too big (15 MB) for json_decode to parse. How can I split this array file into multiple smaller array files?

The array contains objects, which may themselves contain nested objects.

  • Does json_decode() crash? Commented Jul 30, 2018 at 14:29
  • 1
    I found this for you stackoverflow.com/questions/4049428/… Commented Jul 30, 2018 at 14:30
  • memory limit exceeded. Commented Jul 30, 2018 at 14:30
  • @MaciejKrawczyk Increase memory limit in php.ini? Commented Jul 30, 2018 at 14:39
  • The PHP limit did it. I didn't want to, because it isn't an environment-independent solution, but it will work for my case, I guess. Commented Jul 30, 2018 at 14:46
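The fix the comment thread settles on can also be applied at runtime instead of editing php.ini. A minimal sketch (the 512M value and the inline sample JSON are illustrative, not from the question):

```php
<?php
// Runtime alternative to editing php.ini (the 512M value is illustrative;
// it must be large enough for the decoded structure, not just the file size).
ini_set('memory_limit', '512M');

// Always check for decode failures rather than assuming success.
$data = json_decode('{"ok": true}', true);
if ($data === null && json_last_error() !== JSON_ERROR_NONE) {
    die('JSON error: ' . json_last_error_msg());
}
```

Note that a decoded PHP structure typically needs several times the raw file size in memory, so the limit may need to be well above 15 MB.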

2 Answers


You can try array chunking:

$halved_array = array_chunk($original_array, ceil(count($original_array)/2));
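Expanding that line into a complete round trip (the file names, sample data, and two-way split are illustrative), each chunk can be re-encoded and written out as its own JSON array file:

```php
<?php
// Build a small sample input file so the sketch is self-contained;
// in practice this would be the real 15 MB file.
$sample = array_map(fn ($i) => ['id' => $i], range(1, 10));
file_put_contents('big.json', json_encode($sample));

$original_array = json_decode(file_get_contents('big.json'), true);

// Split into two halves and write each half as its own JSON array file.
$halved = array_chunk($original_array, (int) ceil(count($original_array) / 2));
foreach ($halved as $i => $chunk) {
    file_put_contents("big.part{$i}.json", json_encode($chunk));
}
```

Each output file is itself a valid JSON array, so it can later be decoded independently. This only helps, of course, once json_decode can load the original file at all.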

1 Comment

If I could get $original_array I would be golden, but json_decode chokes on the text data.

You can't chunk the file without having the whole string in memory at some point. You may need to address the source of the problem (if you can) instead of trying to work around it. However, you could set up a cron job whose only task is to chunk this JSON file into multiple JSON files.

2 Comments

I guess there should be a way. If we could parse the file character by character, find the matching [, cut the parts after some amount of data has been parsed, and save each part to another file.
It would not be reliable at all; I highly discourage you from parsing a JSON file outside the scope of the JSON extension.
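For completeness, the bracket-matching idea from the comment above could be sketched as follows. This is exactly the kind of hand-rolled parsing the answer warns against: the function (its name and signature are invented here, not from any library) tracks nesting depth and in-string state character by character, assumes the top level is a JSON array of objects, and makes no promise of robustness:

```php
<?php
// Split a top-level JSON array of objects into files of $perFile objects each,
// without decoding the whole document. Fragile by design; for illustration only.
function split_json_array(string $inFile, string $outPrefix, int $perFile): int
{
    $in = fopen($inFile, 'rb');
    $depth = 0; $inString = false; $escaped = false;
    $buf = ''; $items = []; $fileIndex = 0;

    $flush = function () use (&$items, &$fileIndex, $outPrefix) {
        if ($items) {
            file_put_contents("{$outPrefix}{$fileIndex}.json",
                '[' . implode(',', $items) . ']');
            $fileIndex++;
            $items = [];
        }
    };

    while (($ch = fgetc($in)) !== false) {
        if ($inString) {                      // inside a string literal:
            $buf .= $ch;                      // copy everything verbatim
            if ($escaped)         { $escaped = false; }
            elseif ($ch === '\\') { $escaped = true; }
            elseif ($ch === '"')  { $inString = false; }
            continue;
        }
        switch ($ch) {
            case '"':
                $inString = true; $buf .= $ch; break;
            case '{': case '[':
                $depth++;
                if ($depth > 1) $buf .= $ch;  // skip the outermost '['
                break;
            case '}': case ']':
                $depth--;
                if ($depth >= 1) $buf .= $ch;
                if ($depth === 1) {           // a top-level object just closed
                    $items[] = $buf; $buf = '';
                    if (count($items) === $perFile) $flush();
                }
                break;
            case ',':
                if ($depth > 1) $buf .= $ch;  // keep commas inside objects only
                break;
            default:
                if ($depth >= 1 && !ctype_space($ch)) $buf .= $ch;
        }
    }
    $flush();                                 // write any remaining objects
    fclose($in);
    return $fileIndex;                        // number of files written
}
```

Each output file is a valid JSON array that json_decode can handle on its own; but as the answer says, a real streaming JSON parser is the safer route.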
