
I'd like to understand why the two prints below produce different results:

f = [lambda x: x**2, lambda x: x+100]
f_new = [lambda x: fi(x) for fi in f]
print( [fi(2) for fi in f] )
print( [fi(2) for fi in f_new] )

The output is:

[4, 102]
[102, 102]
  • Are you under the impression that ^ means "exponentiate" in Python? It doesn't; it's bitwise XOR. Commented Feb 23, 2021 at 21:46
  • @user2357112supportsMonica: Sorry, I copied the code from Sage. I've just fixed it to make it consistent with pure Python. Commented Feb 23, 2021 at 21:48
  • While your example is complete, it's not what I'd call minimal. You'd get the same "unexpected" behavior from something as simple as l = [lambda x: fi(x) for fi in f]. I'd recommend editing out all of the references to itertools, since they're distracting from the question you're really asking. Commented Feb 23, 2021 at 21:49
  • The itertools documentation has roughly equivalent functions for those two methods - were they any help in understanding how they were working? Commented Feb 23, 2021 at 21:50
  • @Brian: I seem to get it! I'll rewrite the question. Commented Feb 23, 2021 at 21:55

1 Answer

The two lambdas in f_new actually call the same function.

This happens because each lambda in the comprehension captures the variable fi itself, not its value at the moment the lambda is created: the body fi(x) is only evaluated when the lambda is eventually called. By the time anything in f_new is called, the loop for fi in f has finished, so fi refers to the last function in f, and every lambda ends up calling that same function object. (Python closures bind names late, at call time, not at definition time.)
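A quick way to see this directly (purely illustrative, using CPython's closure introspection) is to check what each lambda's closure cell holds once the comprehension has finished:

f = [lambda x: x**2, lambda x: x+100]
f_new = [lambda x: fi(x) for fi in f]

# Each lambda closes over the comprehension's fi variable; once the loop is
# done, that variable holds the last function in f.
print(f_new[0].__closure__[0].cell_contents is f[-1])   # True
print(f_new[1].__closure__[0].cell_contents is f[-1])   # True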

To avoid this late-binding effect you need to bind the current value of fi at the moment each lambda is created, for example by passing it through an immediately invoked outer lambda:

f_new = [(lambda fn: lambda x: fn(x))(fi) for fi in f]
print( [fi(2) for fi in f_new] )
[4, 102]
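
For completeness, a common alternative fix (a minimal sketch, using the same f as above) is to bind fi through a default-argument value, which is evaluated once when each lambda is defined:

f = [lambda x: x**2, lambda x: x+100]

# Default values are evaluated at definition time, so each lambda keeps its
# own reference to the right function instead of looking up fi at call time.
f_new = [lambda x, fn=fi: fn(x) for fi in f]
print( [fi(2) for fi in f_new] )   # [4, 102]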

Comments

  • This explains it! Thank you! A follow-up question: from this perspective, why do chain() and chain.from_iterable() produce different results in my original code example? Just in case, the code is available at Sagecell.
  • chain.from_iterable() finishes consuming each inner iterable before for fi in f advances to the next value, so the function currently in fi is applied to the range() before fi moves on; this effectively avoids the capture side effect. chain(*...) evaluates all of its arguments up front, so every lambda is created before any range() is consumed, which falls straight into the capture trap (see the sketch after these comments).
  • Thank you! Returning to the current code in question, it seems that the easiest fix is to create f_new as a generator rather than a list: f_new = (lambda x: fi(x) for fi in f)
  • For the direct execution of the function, it would indeed be enough. If you're going to use the functions in other iterators, you'll need to be careful to ensure lazy evaluation throughout.
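
A rough sketch of the chain() vs chain.from_iterable() behavior discussed in these comments (an illustrative reconstruction, not the original Sagecell code):

from itertools import chain

f = [lambda x: x**2, lambda x: x+100]

# Lazy: from_iterable consumes each inner map() completely before the outer
# generator advances fi, so every lambda is called while fi still refers to
# the function it was written against.
lazy = chain.from_iterable(map(lambda x: fi(x), range(3)) for fi in f)
print(list(lazy))    # [0, 1, 4, 100, 101, 102]

# Eager: the * unpacking exhausts the generator first, so all the lambdas
# exist before any range() value is consumed; fi has already reached the
# last function and every lambda calls x+100.
eager = chain(*(map(lambda x: fi(x), range(3)) for fi in f))
print(list(eager))   # [100, 101, 102, 100, 101, 102]

# The generator-expression fix from the last comments works for the same
# reason: f_new is consumed lazily, while fi still refers to the matching
# function (but, being a generator, it can only be iterated once).
f_new = (lambda x: fi(x) for fi in f)
print( [g(2) for g in f_new] )   # [4, 102]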