
I am building on the code here, which shows how to convert data from MySQL to JSON using PHP. I have got it working for a subset of my database (39,000 rows) but not for the whole database: the request just hangs, I get a 500 Internal Server Error in the Net tab in Firebug, and nothing is written to the files. The code writes the DB to two files, a .json file and a .js file containing var data2 = <json array>.

Here is a snippet of what works:

//$sql = "select * from table1"; //does not work
//$sql = "select * from table1 LIMIT 0,10"; //for test
$sql = "select * from table1 LIMIT 0,39000"; //works 39000ish is the max

Is there any way I can alter my code to handle this bigger DB of 300,000+ rows and counting?

Here is my complete code:

  <?php
    /* Database connection start */
    $servername = "localhost";
    $username = "root";
    $password = "Password1";
    $dbname = "test";

        //open connection to mysql db
        //open connection to mysql db (mysqli_connect_error() here, since $connection is unusable if the connect fails)
        $connection = mysqli_connect($servername, $username, $password, $dbname) or die("Error " . mysqli_connect_error());

        //fetch table rows from mysql db
        //$sql = "select * from table1"; //does not work
        //$sql = "select * from table1 LIMIT 0,10"; //for test
        $sql = "select * from table1 LIMIT 0,39000"; //works 39000ish is the max
        $result = mysqli_query($connection, $sql) or die("Error in Selecting " . mysqli_error($connection));

        //create an array
        $emparray2 = array();
        while ($row = mysqli_fetch_assoc($result))
        {
            $emparray2[] = $row;
        }
        echo json_encode($emparray2);

        //close the db connection
        mysqli_close($connection);

        //write to json file
        $fp = fopen('empdata2.json', 'w');
        fwrite($fp, json_encode($emparray2));
        fclose($fp);

        //write to js file with var data2 = <ARRAY>
        $fp = fopen('data2.js', 'w');
        fwrite($fp, "var data2 =");
        fwrite($fp, json_encode($emparray2, JSON_PRETTY_PRINT));//makes it pretty
        //fwrite($fp, json_encode($emparray2));
        fclose($fp);

    ?>
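
For reference, here is a minimal, untested sketch of how the same two files could be produced without buffering every row in a PHP array (it assumes the same credentials and table as above, and uses MYSQLI_USE_RESULT to stream rows from MySQL one at a time):

<?php
// Sketch: stream rows straight to the output files instead of building
// the whole array in memory first.
$connection = mysqli_connect("localhost", "root", "Password1", "test")
    or die("Error " . mysqli_connect_error());

// MYSQLI_USE_RESULT fetches rows from the server as they are read,
// rather than loading the entire result set into memory.
$result = mysqli_query($connection, "select * from table1", MYSQLI_USE_RESULT)
    or die("Error in Selecting " . mysqli_error($connection));

$json = fopen('empdata2.json', 'w');
$js   = fopen('data2.js', 'w');

fwrite($json, "[");
fwrite($js, "var data2 = [");

$first = true;
while ($row = mysqli_fetch_assoc($result)) {
    // Encode and write one row at a time; only $row is held in memory.
    $chunk = ($first ? "" : ",") . json_encode($row);
    fwrite($json, $chunk);
    fwrite($js, $chunk);
    $first = false;
}

fwrite($json, "]");
fwrite($js, "];");
fclose($json);
fclose($js);

mysqli_free_result($result);
mysqli_close($connection);
?>

(JSON_PRETTY_PRINT is dropped in this sketch, since pretty-printing the array as a whole would require holding all of it in memory again.)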

Possible similar questions here:

1 here 2 here 3 here 4 here 5 here

Edit1

For my reference, this is what I have to change.

root@a4b8b0a15197:/# cat /etc/php5/apache2/php.ini | grep -n memory_limit
406:memory_limit = 128M
root@a4b8b0a15197:/#
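
For example (512M is just an assumed value; it has to be large enough for the encoded data), that line could be raised and Apache restarted so the change takes effect:

memory_limit = 512M

root@a4b8b0a15197:/# service apache2 restart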
  • I often ask this but what does "does not work" actually mean? :) Commented Jul 4, 2016 at 3:31
  • But what are you doing that for? Commented Jul 4, 2016 at 3:32
  • Do you use whole 300,000+ data? I can't believe this. I suggest you data pagination. Commented Jul 4, 2016 at 3:37
  • Do you have access to the server? You may need to increase some settings in php.ini to allow for a longer processing time. Commented Jul 4, 2016 at 3:54
  • See @AndyRyu's answer below :) Commented Jul 4, 2016 at 4:29

1 Answer


You can use the set_time_limit() function:

<?php
set_time_limit(0); // no limit on execution time

/* Database connection start */
$servername = "localhost";
$username = "root";
$password = "Password1";
$dbname = "test";

And you have to increase the memory limit, like THIS.
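
For instance (a sketch only; the 512M value is an assumption, size it to your data), both limits can also be raised at runtime at the top of the script, provided the host allows ini_set() to override memory_limit:

<?php
set_time_limit(0);               // no limit on execution time
ini_set('memory_limit', '512M'); // raise PHP's per-script memory cap (assumed value)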


2 Comments

Is memory_limit the max size of the file to be written or the size of the data from the db? Thanks.
@HattrickNZ No, it is the maximum memory the PHP script itself may use, so please set it generously enough to hold the data. I recommend starting from 200 megabytes.
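(For a rough, hypothetical sense of scale: 300,000 rows at ~300 bytes of JSON each is about 90 MB of output, and json_encode() on the full array needs both the array and the encoded string in memory at once, so a memory_limit of 200M or more is plausible here.)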
