If you are running PHP on a web server (and curl_multi may be unavailable), one way to run scripts in parallel without external libraries is to open sockets to localhost:80 and make the web server itself execute the scripts you want. They run concurrently thanks to the server's own multithreading. Then, in a loop, you collect the results, and when all of them are done (or after a timeout of your choice) you move on.
This is a piece of code taken from a script that retrieves the sizes of all images referenced on a webpage.
The get_img_size.php script retrieves the size and info of one image.
$sockets[] is an array that keeps one socket for every image to test.
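The helper itself is not shown here, so here is a hypothetical sketch of what get_img_size.php could look like, assuming it simply calls getimagesize() on the URL (which requires allow_url_fopen for remote URLs) and echoes a "width;height;mime" line for the caller to parse; the output format is an assumption, not the original:

<?php
// tools/get_img_size.php -- hypothetical sketch, not the original helper.
// Fetches one image and reports its dimensions and MIME type.
$url = isset($_GET['url']) ? $_GET['url'] : '';
if ($url === '') exit;
$info = @getimagesize($url);          // needs allow_url_fopen for remote URLs
if ($info !== false) {
    echo $info[0] . ';' . $info[1] . ';' . $info['mime'];   // assumed format
}

Now the excerpt that opens the sockets: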
foreach($metaItems['items'] as $uCnt=>$uVal) {
    $metaItem = ContentLoader::splitOneNew($metaItems, $uCnt);
    $AnImage  = $metaItem['url'];
    // One socket per image, opened back to our own server
    $sockets[$AnImage] = fsockopen($_SERVER['HTTP_HOST'], 80, $errno, $errstr, 30);
    if(!$sockets[$AnImage]) {
        echo "$errstr ($errno)<br />\n";
    } else {
        // Ask the server to run get_img_size.php for this image
        $pathToRetriever = dirname($_SERVER['PHP_SELF']).'/tools/get_img_size.php?url='.rawurlencode($AnImage);
        $out  = "GET $pathToRetriever HTTP/1.1\r\n";
        $out .= "Host: ".$_SERVER['HTTP_HOST']."\r\n";
        $out .= "Connection: Close\r\n\r\n";
        fwrite($sockets[$AnImage], $out);
        fflush($sockets[$AnImage]);
        // The request is now running server-side; do not wait for the answer here
    }
}
} else $FoundImagePaths2[] = $metaItems; // all of these URLs belong to this server (the matching "if" is outside this excerpt)
After this you can go about your own business while the "threads" keep working; then, in a loop, you read from every entry of $sockets[] and test for EOF. In the example, much later in the code (inside a loop over each $AnImage), a chunk is read on every pass and the socket is closed only once feof() reports the response is complete:
if(isset($sockets[$AnImage])) {
    if(!isset($sizes[$AnImage])) $sizes[$AnImage] = '';
    // Collect whatever this "thread" has produced so far
    $sizes[$AnImage] .= fgets($sockets[$AnImage], 4096);
    if(feof($sockets[$AnImage])) {
        // Response complete: close the socket and parse the payload
        fclose($sockets[$AnImage]);
        unset($sockets[$AnImage]);
        $mysizes = ContentLoader::cleanResponse($sizes[$AnImage]);
        if(!is_array($mysizes)) continue;
        // Keep only images larger than 64x64 with at least one side over 128
        if($mysizes[0]>64 && $mysizes[1]>64 && ($mysizes[0]>128 || $mysizes[1]>128))
            $FoundImagePaths2[] = array('kind'=>'image', 'url'=>$AnImage, 'ext'=>$ext,
                'width'=>$mysizes[0], 'height'=>$mysizes[1], 'mime'=>$mysizes['mime']);
    }
}
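ContentLoader::cleanResponse() is not shown either; presumably it strips the HTTP response headers and turns the payload back into an array. A hypothetical sketch, assuming the "width;height;mime" format from above and ignoring chunked transfer-encoding for simplicity:

// Inside class ContentLoader -- hypothetical sketch, not the original method.
public static function cleanResponse($raw) {
    $pos = strpos($raw, "\r\n\r\n");      // end of the HTTP headers
    if($pos === false) return false;
    $body  = trim(substr($raw, $pos + 4));
    $parts = explode(';', $body);         // assumed "width;height;mime" payload
    if(count($parts) < 3) return false;
    return array((int)$parts[0], (int)$parts[1], 'mime' => $parts[2]);
}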
It is not efficient in terms of memory, processes, or raw speed, but if a single image takes a few seconds to check, a whole page with 20+ images takes roughly the same few seconds to test them all. It is, after all, a kind of parallel PHP.
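To sum the pattern up, here is a minimal self-contained sketch of the whole technique (the URLs, the helper path and the 10-second deadline are illustrative; stream_set_blocking() is added so the polling loop never stalls on one slow socket):

<?php
// Fire one local HTTP request per URL, then poll the sockets until all close.
$urls    = array('http://example.com/a.jpg', 'http://example.com/b.jpg');
$sockets = array();
$results = array();

foreach($urls as $url) {
    $fp = fsockopen($_SERVER['HTTP_HOST'], 80, $errno, $errstr, 30);
    if(!$fp) continue;
    $path = dirname($_SERVER['PHP_SELF']).'/tools/get_img_size.php?url='.rawurlencode($url);
    fwrite($fp, "GET $path HTTP/1.1\r\nHost: ".$_SERVER['HTTP_HOST']."\r\nConnection: Close\r\n\r\n");
    stream_set_blocking($fp, false);      // so the polling loop below never blocks
    $sockets[$url] = $fp;
    $results[$url] = '';
}

$deadline = time() + 10;                  // overall timeout of your choice
while($sockets && time() < $deadline) {
    foreach($sockets as $url => $fp) {
        $results[$url] .= fgets($fp, 4096);   // grab whatever has arrived so far
        if(feof($fp)) {                       // this "thread" is done
            fclose($fp);
            unset($sockets[$url]);
        }
    }
    usleep(50000);                        // do not busy-spin the CPU
}
// Each $results[$url] now holds a raw HTTP response (headers + payload).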