
I have a table with around 100,000 records; the structure is shown below:

id | name    | desc   | length | breadth | ... | remark      // up to 56 fields
---|---------|--------|--------|---------|-----|----------
1  | FRT-100 | -desc- | 10     | 10      | ... | remarking

What I am doing is using a cron job (Gearman) to write all this data to a CSV; my code is given below:

<?php
set_time_limit(0);
ini_set("memory_limit", "2048M");

// get the total count of records, so that we can loop over it in small chunks
$query = "SELECT COUNT(*) AS cnt FROM tablename WHERE company_id = $companyid";
$result = $link->query($query);
$count = 0;
while ($row = mysqli_fetch_array($result)) {
    $count = $row["cnt"];
}
if ($count > 1000) {
    $loop = ceil($count / 1000);
} else {
    $loop = 1;
}

// I'm going to write it in small chunks of 1000 each time to avoid a timeout
for ($ii = 1; $ii <= $loop; $ii++) {

    if ($ii == 1) {
        $s = 1;
    } else {
        $s = floatval(($ii * 1000) - 1000);
    }

    $q = "SELECT * FROM datas WHERE group_company_id = $companyid LIMIT 1000 OFFSET $s";
    $r = $link->query($q);

    while ($row2 = mysqli_fetch_array($r)) {
        // my csv writing is done here; it works fine for records up to 10,000 ~ 12,000,
        // after that memory exhaustion occurs
    }
}
?>

I strongly suspect something can be optimised in the OFFSET clause of MySQL. Can someone show me a better way to optimise it? I'm open to any suggestions (cron, third-party libraries, etc.).
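For reference, a common alternative to large OFFSET values is keyset (seek) pagination. Below is a minimal sketch of that idea, assuming the table has an indexed auto-increment id column (the column name is an assumption, not taken from the question):

<?php
// Keyset pagination sketch: remember the last id seen and seek past it,
// instead of making MySQL skip $s rows with OFFSET for every chunk.
$lastId = 0;
do {
    $q = "SELECT * FROM datas
          WHERE group_company_id = $companyid AND id > $lastId
          ORDER BY id
          LIMIT 1000";
    $r = $link->query($q);
    $rows = 0;
    while ($row2 = mysqli_fetch_assoc($r)) {
        $lastId = $row2["id"];
        $rows++;
        // csv writing for this row goes here
    }
} while ($rows === 1000); // a short chunk means we have reached the end
?>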

  • while ($row = mysqli_fetch_array($result)) { --- why do you need a loop here? Commented Jan 13, 2014 at 4:22
  • @zerkms to get the count of how many records ..since its a single row..yes you are right Commented Jan 13, 2014 at 4:24
  • @zerkms can you put some light on the limit,offset issue Commented Jan 13, 2014 at 4:24
  • @zerkms $count = $row[0]["cnt"]; like this Commented Jan 13, 2014 at 4:28
  • $count = $row["cnt"]; Commented Jan 13, 2014 at 4:31

1 Answer


Try to avoid storing everything in memory at once; instead, load each row and write out the result one row at a time.

<?php
$q = "SELECT * FROM datas";
$r = $link->query($q);
$fp = fopen("out.csv", "w+");
// Or you could just set the headers for content type, and echo the output
while ($row2 = mysqli_fetch_assoc($r)) { // fetch_assoc, so each column appears only once per row
    fwrite($fp, implode(",", $row2) . "\n");
}
fclose($fp);

This should solve the issue; nothing is being stored in memory.
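One caveat with implode(): it does not quote or escape fields, so any value containing a comma, quote, or newline will corrupt the CSV. A small sketch using fputcsv(), which handles quoting, under the same assumptions as the code above (same $link connection and out.csv file):

<?php
$r = $link->query("SELECT * FROM datas");
$fp = fopen("out.csv", "w+");
while ($row2 = mysqli_fetch_assoc($r)) {
    fputcsv($fp, $row2); // fputcsv adds quoting/escaping as needed
}
fclose($fp);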


1 Comment

This is actually not 100% accurate; I have since found out that you also need to stop the MySQL client from buffering the whole result set into memory. To achieve this you can use mysql_unbuffered_query($q), or if using PDO: $pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
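A minimal sketch of that unbuffered approach for mysqli (the extension used in the question), assuming the same $link connection and out.csv file as in the answer; MYSQLI_USE_RESULT tells the client library to stream rows instead of buffering the whole result set:

<?php
// Unbuffered query: rows are streamed from the server as they are fetched.
$r = $link->query("SELECT * FROM datas", MYSQLI_USE_RESULT);
$fp = fopen("out.csv", "w+");
while ($row2 = mysqli_fetch_assoc($r)) {
    fputcsv($fp, $row2);
}
fclose($fp);
mysqli_free_result($r); // free the result before running another query on this connection
?>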
