
I'm running into an issue with my shared hosting provider and my 128MB memory limit. I'm posting data from AJAX to a PHP controller that reads a file, parses the new data from AJAX, and overwrites that file with the original content plus the new data. We're dealing with a JSON file.

The host has bumped me up all they can at this point. The JSON reaches about 16MB in size before crap hits the fan. If anyone has suggestions for consolidating the data more efficiently, I'd greatly appreciate it. I'm currently bashing the memory limit to -1, and I'd love to avoid doing this!!

Here's my code:

<?php

function appendCurrentFile($payload) {

    $_POST['payload'] = null; // Clear the POST payload

    ini_set('memory_limit', -1); // Fixes the problem the WRONG way

    $files = glob('/path/*'); // Get all JSON files in the storage directory
    natsort($files); // Sort the files in natural order
    $lastFile = array_pop($files); // Get the last file (most recent JSON dump)

    $fileData = json_decode(file_get_contents($lastFile)); // Load the file contents into its own object
    $payload = json_decode($payload); // Decode the POST payload from AJAX

    $newArray = array_merge($fileData->cards, $payload); // Merge the data from our file and our payload
    $payload = null; // Clear the payload variable

    $fileData->cards = $newArray; // Set the file object's array to our new, merged array
    $newArray = null; // Clear our merged array

    file_put_contents($lastFile, json_encode($fileData, JSON_UNESCAPED_SLASHES)); // Overwrite the latest file with existing data + new data

    echo 'JSON updated! Memory Used: ' . round(memory_get_peak_usage() / 134217728 * 100) . '%'; // Report peak usage as a percentage of the 128MB limit
}
?>
  • stackoverflow.com/questions/4049428/… Commented Apr 14, 2017 at 23:39
  • You grab the last file, add new data to it, write a new bigger file, and repeat ad nauseum? Maybe your problem isn't memory allocation. Commented Apr 14, 2017 at 23:51
  • Please elaborate. Commented Apr 14, 2017 at 23:58
  • I get that I have a lack of understanding @AlexHowansky - if that's what you're specifically pointing out :) Commented Apr 15, 2017 at 0:17
  • Heh heh no, I'm merely suggesting that maybe a more appropriate solution would be to implement a design that didn't rely on an ever-growing text file. E.g., maybe it's time to start thinking about a database. Commented Apr 15, 2017 at 1:25
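For what that last comment suggests, here is a minimal sketch using SQLite via PDO. The database path, table name, and columns are hypothetical examples, not anything from the question; the point is that each card becomes its own row, so an append costs the same no matter how much data has accumulated.

<?php
// Hypothetical sketch of the database approach suggested in the comments,
// using SQLite through PDO. The path, table, and columns are examples only.
$db = new PDO('sqlite:/path/cards.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// One row per card; no need to load or rewrite existing data on append.
$db->exec('CREATE TABLE IF NOT EXISTS cards (
    id   INTEGER PRIMARY KEY AUTOINCREMENT,
    data TEXT NOT NULL
)');

// Insert each card from the AJAX payload as its own row.
$stmt = $db->prepare('INSERT INTO cards (data) VALUES (?)');
foreach (json_decode($_POST['payload'], true) as $card) {
    $stmt->execute([json_encode($card, JSON_UNESCAPED_SLASHES)]);
}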

1 Answer


I used fopen('file.json', 'a+') and some string manipulation instead. WAY less memory load: about 1% of my maximum allowed memory in total each time, instead of 1-3% more memory usage with each poll. Edit: Thanks for the responses!

$files = glob('/path/*'); // Get all JSON files in the storage directory
natsort($files); // Sort the files in natural order
$lastFile = array_pop($files); // Get the last file (most recent JSON dump)

$file = fopen($lastFile, 'a+'); // Open a stream with write access, cursor at the end of the file
fwrite($file, $payload); // Append the JSON chunk to the end of the file
fclose($file); // Close the file

echo 'JSON updated! Memory Used: ' . round(memory_get_peak_usage() / 134217728 * 100) . '%'; // Now ~1% of total memory each time instead of 1-3% more per poll
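One caveat worth noting: a bare append like the fwrite above leaves the file as invalid JSON once it already ends with a complete document. Here is a minimal sketch of what the "string manipulation" part might look like, assuming the file ends with "]}" (the cards array close, then the object close); the function name and its tail check are hypothetical, not from the answer.

<?php
// Sketch of the splice: overwrite the trailing "]}" with ",<new items>]}"
// so the document stays valid JSON. appendCards() and its assumptions
// (file ends in "]}", $payload is a JSON array like '[{"id":1}]') are
// hypothetical, not taken from the answer above.
function appendCards(string $file, string $payload): void
{
    $items = trim($payload, " \t\n\r[]"); // strip the payload's own brackets

    $fp = fopen($file, 'r+'); // read/write without truncating
    fseek($fp, -2, SEEK_END); // position just before the final "]}"
    if (fread($fp, 2) !== ']}') {
        fclose($fp);
        throw new RuntimeException('Unexpected file tail');
    }

    fseek($fp, -2, SEEK_END); // back up again after the read
    // NOTE: if the cards array is currently empty ("cards":[]), the
    // leading comma would break the JSON; a real version needs a guard.
    fwrite($fp, ',' . $items . ']}'); // splice the items, restore the tail
    fclose($fp);
}

Since fwrite() at that offset overwrites the two tail bytes in place and then extends the file, no ftruncate() call is needed.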