
I'm using a PHP proxy to get the contents of a file. I want to search through that file using the powerful jQuery options, without having to write all kinds of queries in PHP. Here is my PHP code:

$page = file_get_contents( filter_var( $_POST['url'], FILTER_SANITIZE_URL ) );
die( json_encode( $page ) );

If the page being loaded gets too big, PHP still reads the entire document, but json_encode-ing it only gives me the first part of the file, not the whole thing. I can't find anything about a size limit on JSON-encoded data, but apparently there is one.

The question: is there a workaround to prevent only part of the file being transferred?

I need to grab files from other domains, so reading the contents of a file in jQuery is not really an option.
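For context, a minimal sketch of such a proxy with basic error checks added (the `url` parameter and `file_get_contents` approach are from the question; the status codes and error reporting are additions, and the script assumes `allow_url_fopen` is enabled):

```php
<?php
// Sanitize the requested URL before fetching it.
$url = filter_var($_POST['url'] ?? '', FILTER_SANITIZE_URL);

$page = file_get_contents($url);
if ($page === false) {
    http_response_code(502);
    die('Fetch failed');
}

$json = json_encode($page);
if ($json === false) {
    // Encoding failed (e.g. invalid UTF-8); report why instead of
    // silently sending a broken or truncated response.
    http_response_code(500);
    die(json_last_error_msg());
}

header('Content-Type: application/json');
echo $json;
```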

  • The question is not clear. The size of data possible to be converted by json_encode is only limited by available memory. Commented May 31, 2011 at 22:47
  • Can you give us last signs of the JSON that was created and sent to the browser? Commented May 31, 2011 at 22:48
  • 1
    Are you sure you're not just hitting a time limit (default 30 seconds)? Try set_time_limit(0). Commented May 31, 2011 at 22:48
  • Why do you want to JSON-serialize it anyway if $page is just a string? Commented May 31, 2011 at 22:51
  • 2
    @Frits van Campen: with time limit hit he will not receive anything, not just a part. Commented May 31, 2011 at 22:51

3 Answers


To help others who may be running into problems they can't explain with json_encode: I've found it helps to know about the JSON error-message function.

json_last_error_msg();

I was having a similar problem, but it wasn't related to file size. I had malformed UTF-8 in the database. You can check your JSON like this:

$json = json_encode($data);

// json_encode() returns false on failure; compare strictly, since a
// valid result like "0" would also be falsy.
if ($json !== false) {
    echo $json;
} else {
    echo json_last_error_msg();
}

PHP docs: json_last_error_msg
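A concrete illustration of the failure mode described above (the Latin-1 byte string is an arbitrary example of invalid UTF-8; `JSON_INVALID_UTF8_IGNORE` requires PHP 7.2+):

```php
<?php
// A Latin-1 encoded "café" is not valid UTF-8, so json_encode() fails.
$data = "caf\xE9";
$json = json_encode($data);

if ($json === false) {
    // Reports: Malformed UTF-8 characters, possibly incorrectly encoded
    echo json_last_error_msg(), PHP_EOL;
}

// One possible fix (PHP 7.2+): drop the offending bytes while encoding.
echo json_encode($data, JSON_INVALID_UTF8_IGNORE), PHP_EOL; // "caf"
```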


4 Comments

Awesome! I'd never have figured this out if I hadn't come across your answer...
DUDE, this is great! You saved me bunch of debugging time!
I was able to fix this issue with json_encode($data, JSON_INVALID_UTF8_IGNORE)
@MarkAinsworth your suggestion worked for me too.

PHP 5.3: ext/json/json.c
PHP 7 (current): ext/json/json.c

There is no built-in restriction on the size of JSON-serialized data. Not for strings, anyway. I would therefore assume you've run into PHP's memory limit or something similar.

json_encode-ing a string consistently just adds some escaping and the outer double quotes. Internally that means a bit of memory doubling (temporary string concatenation and a UTF-8 to UTF-16 conversion/check), so I ran into my 32 MB PHP memory limit with an 8 MB string already. But other than that, there seem to be no arbitrary limits in json.c.

4 Comments

the size was way smaller than 8 MB. I 'fixed' it (it's just a workaround) by not importing things as JSON. It's not the PHP memory limit; otherwise I couldn't have 'fixed' the problem by going around JSON
@patrick it would be worth mentioning how you 'fixed' it and your workaround, to help others visiting here in the future.
Like I said above: I fixed it by not encoding
... to elaborate on 'not encoding'... I used: exit( $page ) at the end, instead of json_encode( $page )... things didn't need to be encoded in this AJAX application, the initial question still remained though...
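The workaround from the comment above, sketched as a complete script (the Content-Type header is an addition; adjust it to whatever content the proxy actually fetches):

```php
<?php
// Serve the fetched page verbatim instead of JSON-encoding it.
// jQuery can then search the returned markup directly.
$url  = filter_var($_POST['url'] ?? '', FILTER_SANITIZE_URL);
$page = file_get_contents($url);

header('Content-Type: text/html; charset=utf-8');
exit($page === false ? '' : $page);
```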

I resolved a similar case, so I'll write it up here.

My website also responded with broken JSON data, about 8 MB in size.

The cause was that the storage had no free space left: the output of df showed "used 100%".

I deleted some files, and that resolved it.
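This condition can also be checked from PHP itself, roughly equivalent to running df (a small sketch; the warning threshold is an arbitrary choice):

```php
<?php
// Warn when the partition holding this script is nearly full — a full
// disk can truncate buffered or temporary output in surprising ways.
$free  = disk_free_space(__DIR__);
$total = disk_total_space(__DIR__);

if ($free !== false && $total !== false && $total > 0) {
    $usedPct = 100 * (1 - $free / $total);
    printf("%.1f%% used\n", $usedPct);
    if ($usedPct > 95) {
        error_log('Warning: disk almost full');
    }
}
```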

1 Comment

Worth checking indeed... in my case this was not the problem; I'd still like to know where the problem I ran into came from. It was not a memory issue, and there were no errors. Still, the problem was there...
