
Hello, I'm trying to export a big table (1 million rows) into a CSV file using the Laravel 5 framework.

With the code pasted below it doesn't work well, since it allocates too much memory: all the results are kept in memory until I've finished writing the file (PHP execution).

At first I tried this (VERY INEFFICIENT):

1. Fetching results from DB

    $mytable  = 'table';
    $filePath = storage_path().'/csv/test.csv';

    $mins = Carbon::now()->subMinutes(10000);

    // temporarily switch to the assoc fetch mode
    DB::setFetchMode(PDO::FETCH_ASSOC);

    $results = DB::table($mytable)->where('ts_activity', '>=', $mins)->get();

    // revert to the default class fetch mode
    DB::setFetchMode(PDO::FETCH_CLASS);

2. Writing the results to the file

    if (count($results) > 0) {
        // write the header line from the first row's keys
        $headerLine = implode(',', array_keys($results[0]));
        dump($headerLine);

        if (!file_put_contents($filePath, $headerLine.PHP_EOL)) {
            throw new Exception("Cannot write FILE: {$filePath}");
        }

        foreach ($results as $row) {
            foreach ($row as $k => $v) {
                if ($row[$k] == '0000-00-00 00:00:00') { $row[$k] = ''; }
                $row[$k] = trim(str_replace(array(',', "\n", "\r"), ' ', $row[$k]));
            }

            // append the row to the file
            if (!file_put_contents($filePath, implode(',', $row).PHP_EOL, FILE_APPEND)) {
                throw new Exception("Cannot write FILE: {$filePath}");
            }
        }

        // check the size of the created file
        $fileSize = filesize($filePath);
        $reportString = "Created file at: $filePath of ($fileSize) bytes";
        // report
        print($reportString);

        return true;
    }

Then I tried using mysqli directly, and everything is fine because each row is fetched into memory and written to the file on every iteration.

    // connection:
    $link = mysqli_connect(env('DB_HOST'), env('DB_USERNAME'), env('DB_PASSWORD'), env('DB_DATABASE'))
        or die("Connection error: " . mysqli_connect_error());

    // query:
    $query = "SELECT * FROM $mytable WHERE ts_activity >= '$mins';";

    // execute the query
    $result = mysqli_query($link, $query) or die("Query error: " . mysqli_error($link));

    $count = 0;
    while ($row = $result->fetch_assoc()) {
        // write the header line once, before the first data row
        if ($count++ == 0) {
            $headerLine = implode(',', array_keys($row));
            dump($headerLine);
            if (!file_put_contents($filePath, $headerLine.PHP_EOL)) {
                throw new Exception("Cannot write FILE: {$filePath}");
            }
        }

        foreach ($row as $k => $v) {
            if ($row[$k] == '0000-00-00 00:00:00') { $row[$k] = ''; }
            $row[$k] = trim(str_replace(array(',', "\n", "\r"), ' ', $row[$k]));
        }

        // append the row to the file
        if (!file_put_contents($filePath, implode(',', $row).PHP_EOL, FILE_APPEND)) {
            throw new Exception("Cannot write FILE: {$filePath}");
        }
    }
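
Note: mysqli_query() buffers the whole result set client-side by default; for a genuinely row-by-row fetch, MYSQLI_USE_RESULT runs an unbuffered query. A minimal sketch, reusing $link and $query from above (the rest of the loop stays the same):

    // Unbuffered query: rows are streamed from the server one at a time
    // instead of being buffered client-side. The whole result must be read
    // before another query can run on the same connection.
    $result = mysqli_query($link, $query, MYSQLI_USE_RESULT);
    while ($row = $result->fetch_assoc()) {
        // ... same header/row handling as above ...
    }
    mysqli_free_result($result);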

How can I obtain the same efficiency using the Laravel 5 DB classes? Thanks.

1 Answer


You might want to try this. It streams the output straight to the response as a CSV download, so the full result set never has to sit in memory. If that's not what you want, the next best thing in my opinion would be to store a job, process it later, and then notify the user / send them the file when it's complete. Hope this helps.

    // Requires: use Symfony\Component\HttpFoundation\StreamedResponse;
    public function export($ids = array())
    {
        $headers = array(
            'Content-Type'        => 'text/csv',
            'Cache-Control'       => 'must-revalidate, post-check=0, pre-check=0',
            'Content-Disposition' => 'attachment; filename=test.csv',
            'Expires'             => '0',
            'Pragma'              => 'public',
        );

        $response = new StreamedResponse(function () use ($ids) {
            // Open the output stream
            $handle = fopen('php://output', 'w');

            // Add the CSV header row
            fputcsv($handle, [
                "Id",
                "Name",
            ]);

            // Fetch and write 500 users at a time so they never all sit in memory
            User::whereIn('id', $ids)
                ->chunk(500, function ($users) use ($handle) {
                    foreach ($users as $user) {
                        // Add a new row with data
                        fputcsv($handle, [
                            $user->id,
                            $user->name,
                        ]);
                    }
                });

            // Close the output stream
            fclose($handle);
        }, 200, $headers);

        return $response->send();
    }
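
For the original question (writing to a file on disk rather than streaming a download), the same chunk() idea can be combined with fputcsv(). A minimal sketch, assuming the $mytable, $mins and ts_activity column from the question (the orderBy is added because chunking needs a stable ordering):

    // Chunked export straight to a file on disk
    $filePath = storage_path().'/csv/test.csv';
    $handle = fopen($filePath, 'w');

    $wroteHeader = false;

    DB::table($mytable)
        ->where('ts_activity', '>=', $mins)
        ->orderBy('ts_activity')          // chunk() needs a stable ordering
        ->chunk(500, function ($rows) use ($handle, &$wroteHeader) {
            foreach ($rows as $row) {
                $row = (array) $row;      // rows come back as stdClass objects

                // write the header line once, from the first row's keys
                if (!$wroteHeader) {
                    fputcsv($handle, array_keys($row));
                    $wroteHeader = true;
                }

                fputcsv($handle, $row);
            }
        });

    fclose($handle);

Because only 500 rows are hydrated at a time, memory use stays flat regardless of the table size, and fputcsv() handles quoting, so the manual comma/newline stripping from the question is no longer needed.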

5 Comments

The chunk method was what I was looking for. Thanks.
@Chathushka I have 100,000 (1 lakh) records but it only exports 50k and then shows a maximum execution time error. Please help.
@kunal Sorry, this is a year later, but it must be your PHP max execution time restriction; you can increase it in your PHP config (php.ini).
@Chathushka But that is not a real solution and isn't recommended. Do you have any other solution?
@kunal Well, the write only works within the limits of the execution time, quite obviously - that's for the streamed write. Maybe you can have a look at writing the file to disk, then creating a download link and emailing it to the user, but it's not instantaneous.
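
The "write to disk and notify the user" route mentioned above avoids the HTTP execution-time limit entirely, since the export runs on a queue worker. A rough sketch of what that could look like (the job class name, columns, path and mail text are all illustrative, and it assumes a Laravel version with the ShouldQueue job API, 5.1 or newer):

    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Support\Facades\DB;
    use Illuminate\Support\Facades\Mail;

    // Hypothetical queued job: builds the CSV on a worker, then emails the user
    // a note telling them where to download it.
    class ExportCsvJob implements ShouldQueue
    {
        use Queueable;

        protected $email;

        public function __construct($email)
        {
            $this->email = $email;
        }

        public function handle()
        {
            $filePath = storage_path('csv/export.csv');
            $handle = fopen($filePath, 'w');

            fputcsv($handle, ['Id', 'Name']);

            // Chunk through the table so memory stays flat
            DB::table('users')->orderBy('id')->chunk(500, function ($rows) use ($handle) {
                foreach ($rows as $row) {
                    fputcsv($handle, [$row->id, $row->name]);
                }
            });

            fclose($handle);

            // Notify the user that the export is ready
            Mail::raw('Your CSV export is ready: '.$filePath, function ($message) {
                $message->to($this->email)->subject('CSV export complete');
            });
        }
    }

    // Dispatching it, e.g. from a controller:
    // dispatch(new ExportCsvJob($request->user()->email));

The worker runs under the queue's own timeout rather than the web request's, and the finished file can then be served through a normal download route or attached to the email.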
