
I basically want to do exactly something like this: Simple Html DOM Caching

I got everything to work so far, but now I'm getting the following error because I scrape many sites (6 at the moment, and I want to go up to 25 sites):

Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 39 bytes)

I'm a PHP newbie... so, how can I "serialize" the scraping process step by step so that my memory doesn't give up? :-)

code sample:

// Include the library
include('simple_html_dom.php');

// retrieve and find contents
$html0 = file_get_html('http://www.site.com/');
foreach($html0->find('#id') as $aktuelle_spiele);

file_put_contents("cache/cache0.html",$aktuelle_spiele);

thank you very much in advance for your help!

  • Up the memory limit... it seems you only have a 32 MB memory limit. Commented May 17, 2013 at 15:59

2 Answers


In your php.ini, change this line:

memory_limit = 32M

to this:

memory_limit = 256M ; or another, greater value

Or add this line at the start of every PHP script that uses simple_html_dom:

ini_set('memory_limit', '128M'); // or a greater value
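Note that some shared hosts block runtime changes to memory_limit, so it is worth reading the value back to confirm the change actually took effect; a minimal sketch:

```php
<?php
// Raise the limit at runtime, then read it back to verify the change
// took effect; some shared hosts disable ini_set for memory_limit,
// in which case ini_get() will still report the old value.
ini_set('memory_limit', '128M');
echo ini_get('memory_limit'); // "128M" if the host allows it
```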

4 Comments

@ThomasVeit, you can run a memory increase at the start of your script, like this: ini_set('memory_limit', '128M')
Hmm, my host does not allow more than 32M of memory... does someone have another idea to reduce my script's memory usage?
I thought you had your own server... you can add the line proposed by @Luigi at the start of every script that uses Simple HTML DOM.
@ThomasVeit, I have posted my comment as an answer. Sorry Robert. +1

You can run a memory increase at the start of your script.

Like this:

ini_set('memory_limit', '128M');
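If the host caps memory_limit (as the comments above suggest), the alternative is to reduce the script's footprint: parse one site at a time and free each DOM tree with Simple HTML DOM's clear() method before loading the next. A rough sketch, where the URL list and the cache/ paths are placeholders for your own:

```php
<?php
// Include the library
include('simple_html_dom.php');

// Hypothetical list of sites to scrape; replace with your real URLs
$sites = array(
    'http://www.site.com/',
    'http://www.site2.com/',
);

foreach ($sites as $i => $url) {
    $html = file_get_html($url);
    if (!$html) {
        continue; // skip sites that could not be fetched
    }

    // Write each matched element to this site's cache file
    foreach ($html->find('#id') as $element) {
        file_put_contents("cache/cache$i.html", $element);
    }

    // Free the DOM tree before parsing the next site; without this,
    // every parsed document stays in memory for the whole run.
    $html->clear();
    unset($html);
}
```

This way only one parsed document is held in memory at a time, which is usually what keeps a multi-site scrape under a 32 MB limit.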

