The documentation for @functools.lru_cache provides an example of computing Fibonacci numbers using a cache to implement a dynamic programming technique:
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
I've seen this approach used for solving various programming puzzles. Does it have the same time/space complexity as the 'standard' iterative dynamic programming approach? For example:
def fib(n):
    if n < 2:
        return n
    dp = [0] * (n + 1)
    dp[1] = 1
    for i in range(2, n + 1):
        dp[i] = dp[i-1] + dp[i-2]
    return dp[n]
Also, are there any downsides to using the recursive approach?
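One downside I can already observe (and would like confirmed): since the memoized version still recurses once per value of `n`, it is bounded by Python's recursion limit (`sys.getrecursionlimit()`, 1000 by default), whereas the iterative version is not. A quick sketch of the behaviour:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# Small inputs work fine:
print(fib(10))  # 55

# But a sufficiently large input overflows the call stack, because
# each recursion level pushes at least one new frame:
try:
    fib(10000)
except RecursionError as e:
    print("RecursionError:", e)
```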