
Given a list x of length n, what's the time complexity of the following algorithm?

def foo(x):
    n = len(x)
    if n <= 1:
        return 17
    return foo(x[:n//2]) + foo(x[n//2:])

The answer is:

O(n log n)

But I can't figure out why. I struggle with the last line, where we use recursion. I know that it cuts the length of the list in half each time, so that's O(log n), but each call also adds the other recursive call, which is O(log n) as well, so I thought it would be O(log n * log n), but unfortunately it's not.

  • Can you detail how much you've figured out from the code? It's unclear which specific part you don't understand. Commented Jun 28, 2018 at 10:44
  • @EdChum Of course, I edited the post. Commented Jun 28, 2018 at 10:54

4 Answers


You are correct in identifying that there is an O(log n) factor, but you've misidentified what it counts: it is the number of levels of recursion it takes to reach the base case. Each time you call foo, you are working with a list half the size of the one you just had, so it takes O(log n) levels to reach the base case.

The next question is: how much work is done at each level? At the first level, the list is broken in half, which requires n memory copies. At the second level, two lists of size n/2 are each broken in half. The amount of work done remains the same! From one level to the next, the size of each list you are cutting halves (because of the slices x[:n//2] and x[n//2:]), but the number of lists you must cut doubles (since foo calls itself twice). Therefore, every level does O(n) work in total.

O(log n) levels * O(n) work at each level = O(n log n) in total.
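To see the per-level accounting concretely, here is a minimal sketch; the depth and work parameters are instrumentation of mine, not part of the original function. It records the total list length handled at each recursion depth:

from collections import defaultdict

def foo_instrumented(x, depth=0, work=None):
    # work maps recursion depth -> total length of all sublists seen there
    if work is None:
        work = defaultdict(int)
    n = len(x)
    work[depth] += n
    if n <= 1:
        return 17, work
    a, _ = foo_instrumented(x[:n//2], depth + 1, work)
    b, _ = foo_instrumented(x[n//2:], depth + 1, work)
    return a + b, work

_, work = foo_instrumented(list(range(16)))
for depth, total in sorted(work.items()):
    print(depth, total)

This prints a total of 16 at every depth from 0 to 4: each level handles n elements in total, across log2(16) + 1 = 5 levels.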




This is similar to merge sort. Slicing the list takes O(n) time, and then the function does the same operation on both halves of the list. The time complexity of merge sort is O(n log n) for the same reason.

If you want a derivation, you can solve the merge sort recurrence directly, as sketched below.
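The recurrence for this function (and for merge sort) is T(n) = 2T(n/2) + n: two recursive calls on half-sized lists plus O(n) work to slice. A minimal sketch, assuming unit cost per element and T(1) = 1, that evaluates it numerically:

import math

def T(n):
    # T(n) = 2*T(n/2) + n, with T(1) = 1 for the base case
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

for n in [16, 256, 4096]:
    print(n, T(n), n * math.log2(n))

T(n) tracks n*log2(n) up to a lower-order O(n) term, which is exactly the merge sort bound.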



This algorithm returns once for each element in the list, which is O(n). It also splits the list in half at each step, like a bisection search, which is O(log n). Therefore, it's O(n log n).

But I can't figure out why. I struggle with the last line, where we use recursion. I know that it cuts the length of the list in half each time, so that's O(log n), but each call also adds the other recursive call, which is O(log n) as well, so I thought it would be O(log n * log n), but unfortunately it's not.

O(log n) + O(log n) = O(log n)

Adding a print statement or two to this should help a lot:

def foo(x):
    n = len(x)
    print(x)
    if n <= 1:
        print("return")    
        return 17
    return foo(x[:n//2]) + foo(x[n//2:])

>>> foo([1,2,3,4,5,6,7,8])
[1, 2, 3, 4, 5, 6, 7, 8]
[1, 2, 3, 4]
[1, 2]
[1]
return
[2]
return
[3, 4]
[3]
return
[4]
return
[5, 6, 7, 8]
[5, 6]
[5]
return
[6]
return
[7, 8]
[7]
return
[8]
return
136

This is clearly returning once for each element in the list, which makes it at least O(n). Additionally, splitting the list in half repeatedly, in a bisection-search style, gives O(log n) levels of recursion.
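To make both counts explicit, here is a sketch with counters added (the stats dict is mine, not part of the original function):

def foo_counted(x, depth=1, stats=None):
    # Counts base-case returns and tracks the deepest recursive call.
    if stats is None:
        stats = {"returns": 0, "max_depth": 0}
    stats["max_depth"] = max(stats["max_depth"], depth)
    n = len(x)
    if n <= 1:
        stats["returns"] += 1
        return 17, stats
    a, _ = foo_counted(x[:n//2], depth + 1, stats)
    b, _ = foo_counted(x[n//2:], depth + 1, stats)
    return a + b, stats

_, stats = foo_counted([1, 2, 3, 4, 5, 6, 7, 8])
print(stats)  # {'returns': 8, 'max_depth': 4}

Eight returns, one per element, across log2(8) + 1 = 4 levels of recursion.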


def foo(x):
    n = len(x)                            # outer
    if n <= 1:                            # inner
        return 17                         # inner
    return foo(x[:n//2]) + foo(x[n//2:]) # outer

We can split the function into two parts. The outer part consists of "n = len(x)" and "return foo(x[:n//2]) + foo(x[n//2:])", which divides x in two recursively; the outer part therefore contributes log n levels. The inner part is the base case, "if n <= 1: return 17", which is reached once for every element, so the inner part contributes n. Multiplying the inner part by the outer part gives us n log n as the result.
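As a sanity check on the claim that the inner part contributes n, here is a small sketch (count_calls is a hypothetical helper of mine) that counts total calls and base-case calls without building any lists:

def count_calls(n):
    # Returns (total_calls, base_case_calls) for a list of length n.
    # n - n//2 mirrors the length of the slice x[n//2:].
    if n <= 1:
        return 1, 1
    t1, b1 = count_calls(n // 2)
    t2, b2 = count_calls(n - n // 2)
    return 1 + t1 + t2, b1 + b2

for n in [8, 100, 1024]:
    print(n, count_calls(n))

For any n >= 1 this gives total calls = 2n - 1 and base-case calls = n: the base case really is hit once per element.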

