
I'm using a foreach loop to process a large set of items, and unfortunately it's using a lot of memory (probably because it's making a copy of the array). Apparently there is a way to save some memory with the following code: $items = &$array;

Isn't it better to use for loops instead?

And is there a way to destroy each item as soon as it has been processed in a foreach loop?

eg.

    $items = &$array;
    foreach ($items as $item)
    {
        dosomethingwithmy($item);
        destroy($item);
    }

I'm just looking for the best way to process a lot of items without running out of resources.

4 Answers


Try a for loop:

$keys = array_keys($array);
for ($i=0, $n=count($keys); $i<$n; ++$i) {
    $item = &$array[$keys[$i]];
    dosomethingwithmy($item);
    destroy($item);
}
6 Comments

Please put the count outside the for loop; you will gain some performance.
My script is using 50% less memory with this loop.
@mnml: That was predictable, since foreach uses an internal copy of the array and thus doubles the memory usage for that array.
An internal copy in PHP doesn't necessarily mean double the memory.
Isn't it pre-increment of $i (++$i)? You will miss the 0th index of $keys.
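A variant on the same idea, assuming the goal from the question is to free each element as soon as it has been processed: iterate by reference and unset each entry after use. Here dosomethingwithmy() is a hypothetical stand-in for the question's processing function.

```php
<?php
// Hypothetical stand-in for the question's dosomethingwithmy().
function dosomethingwithmy($item) {
    return strlen($item);
}

$array = array_fill(0, 3, str_repeat('x', 10));

// Iterating by reference avoids assigning a copy of each element into
// the loop variable; unsetting the processed entry lets PHP reclaim
// its memory before the loop finishes.
foreach ($array as $key => &$item) {
    dosomethingwithmy($item);
    unset($array[$key]);
}
unset($item); // break the dangling reference left by foreach-by-reference

echo count($array), " items remaining\n";
```

Note that unsetting the current element inside a by-reference foreach is safe, because the loop iterates the original array rather than a copy.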

Resource-wise, your code will be more efficient if you use a for loop instead of a foreach loop. Each iteration of a by-value foreach assigns the current element to the loop variable, which can cost time and memory; using for and accessing the current item by index is a bit faster and lighter.
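One way to check this claim on your own data is to compare memory_get_usage() around each loop style; a minimal sketch (the array size and element contents are arbitrary):

```php
<?php
$data = array_fill(0, 100000, str_repeat('a', 32));

$before = memory_get_usage();
foreach ($data as $item) {
    $len = strlen($item); // by-value foreach: element assigned to $item
}
$afterForeach = memory_get_usage();

$before2 = memory_get_usage();
for ($i = 0, $n = count($data); $i < $n; ++$i) {
    $len = strlen($data[$i]); // indexed access: no per-element loop variable
}
$afterFor = memory_get_usage();

echo "foreach delta: ", $afterForeach - $before, " bytes\n";
echo "for delta:     ", $afterFor - $before2, " bytes\n";
```

Results vary by PHP version: modern PHP uses copy-on-write, so the by-value "copy" is often deferred until an element is actually modified, and both deltas may be near zero.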



Use this:

    reset($array);
    while (list($key_d, $val_d) = each($array)) {
        // process $key_d / $val_d
    }

because foreach creates a copy.



If you are getting that large data set from a database, it often helps to consume the data as soon as it comes from the database. For example, from the PHP mysql_fetch_array documentation:

    $resource = mysql_query("query");
    while ($row = mysql_fetch_array($resource, MYSQL_NUM)) {
        process($row);
    }

This loop will not create an in-memory copy of the entire dataset (at least not redundantly). A friend of mine sped up some of her query processing tenfold using this technique (her datasets are biological, so they can get quite large).
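The mysql_* extension was removed in PHP 7, but the same row-at-a-time pattern works with PDO. A self-contained sketch using an in-memory SQLite database (table and column names are illustrative; with MySQL you would additionally disable result buffering via PDO::MYSQL_ATTR_USE_BUFFERED_QUERY so the driver streams rows from the server):

```php
<?php
// In-memory SQLite so the example runs without a server.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE items (id INTEGER, payload TEXT)');
$pdo->exec("INSERT INTO items VALUES (1,'a'),(2,'b'),(3,'c')");

$count = 0;
$stmt = $pdo->query('SELECT id, payload FROM items');
// Fetch one row at a time instead of fetchAll(), which would load
// the whole result set into memory at once.
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    $count++; // process($row) would go here
}

echo $count, " rows processed\n";
```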

