I basically want to do exactly something like this: Simple HTML DOM Caching
I got everything working so far, but now I'm getting the following error because I scrape many sites (6 at the moment; I want to go up to 25 sites):
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 39 bytes)
I'm a PHP newbie =/ ... so, how can I "serialize" the scraping process step by step so that my memory doesn't give up? :-)
Code sample:
// Include the library
include('simple_html_dom.php');
// retrieve and find contents
$html0 = file_get_html('http://www.site.com/');
foreach ($html0->find('#id') as $aktuelle_spiele) {
    file_put_contents('cache/cache0.html', $aktuelle_spiele);
}
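(Note the original loop had a stray semicolon after the `foreach`, so the body never ran inside it.) From what I've read, Simple HTML DOM objects have to be released explicitly with `$html->clear()` before loading the next site, or memory keeps piling up. Here is a sketch of what I think the step-by-step version could look like; the site URLs and selectors are placeholders:

```php
<?php
// Include the library
include('simple_html_dom.php');

// Placeholder list of sites and the selector to cache from each one
$sites = array(
    'http://www.site.com/'  => '#id',
    'http://www.site2.com/' => '#id',
    // ... up to 25 sites
);

$i = 0;
foreach ($sites as $url => $selector) {
    // Load one site at a time
    $html = file_get_html($url);

    // Write every matching element into this site's cache file
    foreach ($html->find($selector) as $element) {
        file_put_contents("cache/cache{$i}.html", $element);
    }

    // Free the DOM object before loading the next site,
    // otherwise memory usage grows with every file_get_html() call
    $html->clear();
    unset($html);
    $i++;
}
```

Is calling `$html->clear()` after each site the right way to keep the memory usage flat, or is there a better approach?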
Thank you very much in advance for your help!