
I'm trying to save some files using cURL. I managed to get this working for a single URL, but when I create an array to hold multiple URLs, it doesn't work. I'm not getting any errors either, even after turning on error reporting; the files are simply not being saved. My code is given below. Any help is much appreciated.

<?php
error_reporting(E_ALL);
$var = "http://www.example.com/trailer/";

$urls = array("1423/10_Things_I_Hate_About_You/", "1470/10.5__Apocalypse/");

$ch = curl_init();

foreach ($urls as $i => $url) {

    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_URL, $url . $var);
    curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($conn[$i], CURLOPT_HEADER, 0);

    // grab URL and pass it to the browser
    $out = curl_exec($conn[$i]);

    // close cURL resource, and free up system resources
    curl_close($conn[$i]);

    $fp = fopen($url . "index.html", 'w');
    fwrite($fp, $out);
    fclose($fp);
}
  • Your concatenation is backwards; it should be $var . $url (see the sketch below these comments). Commented Sep 8, 2013 at 18:38
  • yes!! you caught an issue Commented Sep 8, 2013 at 18:52
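
For reference, here is a minimal corrected sketch of the loop. It assumes the fix from the comment above ($var . $url instead of $url . $var) and writes each page to a flat local filename, since directories like "1423/10_Things_I_Hate_About_You/" probably do not exist locally for fopen() to write into:

<?php
error_reporting(E_ALL);

$var  = "http://www.example.com/trailer/";
$urls = array("1423/10_Things_I_Hate_About_You/", "1470/10.5__Apocalypse/");

foreach ($urls as $url) {
    // Base URL first, then the relative path.
    $ch = curl_init($var . $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_HEADER, 0);

    $out = curl_exec($ch);
    curl_close($ch);

    // Flatten the remote path into a filesystem-safe local name.
    $file = str_replace("/", "_", rtrim($url, "/")) . ".html";
    file_put_contents($file, $out);
}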

1 Answer


How do file_get_contents and file_put_contents work for you? You could combine the following into a single line.

<?php
    $homepage = file_get_contents( 'http://www.example.com/' );
    file_put_contents( '/my/filename.html', $homepage );
?>
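
Combined into a single line (same placeholder URL and path as above), that looks like:

<?php
    file_put_contents( '/my/filename.html', file_get_contents( 'http://www.example.com/' ) );
?>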

Alternatively, I wrote a curl wrapper class. Here is some sample usage:

https://github.com/homer6/altumo/blob/master/source/php/Http/OutgoingHttpRequest.md

$http_response = $client->sendAndGetResponseMessage( true );

is the operative line.


4 Comments

Thanks for your answer!!! I never thought of that, so the answer will be: <?php $var = "example.com"; $urls = array("1997/12_Angry_Men/","2091/127_Hours/"); foreach($urls as $url) { $homepage = file_get_contents($var . $url); file_put_contents( $url . "index.html", $homepage ); } ?>
Hi, can you please tell me how I can get a result as each foreach iteration completes? The array has more than 10000 URLs, so I want to know which ones are done so far. Many times the operation breaks and downloading stops, so it would help to know how much of the loop has completed.
I would log whatever you want to track (or both). You could read the URLs from a "to-do list" where each line has a URL. Then, as each one completes successfully, the script removes that line from the "to-do list" and adds it to a "completed" list (which is a separate text file).
You can also add an error handler function to log the ones that failed in an "error log" (again, just a text file with one URL per line). The script would log the error and continue on. In the end, your "to-do list" will be empty, but the "completed" and "error" lists will have entries. See php.net/manual/en/function.set-error-handler.php; a minimal sketch of this approach follows below.
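
A minimal sketch of that to-do/completed/error-log idea (the file names todo.txt, completed.txt and errors.txt are placeholders, not from the original post):

<?php
// Read the remaining URLs, one per line.
$todo = file("todo.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

while (count($todo) > 0) {
    $url  = array_shift($todo);
    $html = @file_get_contents($url);

    if ($html === false) {
        // Log the failure and continue with the next URL.
        file_put_contents("errors.txt", $url . PHP_EOL, FILE_APPEND);
    } else {
        file_put_contents(basename(rtrim($url, "/")) . ".html", $html);
        file_put_contents("completed.txt", $url . PHP_EOL, FILE_APPEND);
    }

    // Rewrite the shrinking to-do list so an interrupted run can resume.
    file_put_contents("todo.txt", implode(PHP_EOL, $todo));
}

This keeps progress on disk after every URL, so checking completed.txt at any time shows how far the run has gone.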
