
I have to determine the runtime complexity of the following algorithm (see picture). I know how to do this in theory, but for this specific example I'm having a hard time, and I also have no sample solution to check whether I'm correct.

If someone could review my answer, correct it, and maybe even give an explanation of the correct solution, that would be greatly appreciated.

    Overall --> Θ(n^2)
    for(int i=42; i<n*Math.log(n); i++)Θ(n)  
        foo(i);                        Θ(n*log2(n))
        for(int j=1; j<4711; j=j*3)    Θ(n)
            bar(i*j);                  Θ(log2(n))

foo(int n)                             --> Θ(n*log2(n))
    if(n%2 == 0)                       Θ(1)
        for(int i=1; i<2*n; i=i+2)     Θ(n)
            bar(i);                    Θ(log2(n))
    else                               --> Θ(n*log2(n))
        for(int i=1; i<n; i++)         Θ(n)
            bar(n);                    Θ(log2(n))
bar(int m)                              --> Θ(log2(n))
     while(m>0)                         Θ(log2(n))
         m = m/4;                       Θ(1)
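
For reference, here is the snippet from the picture transcribed into compilable Java (braces and void return types added by me; the wrapper name algorithm is my own, and the Θ(...) notes above are just my guesses):

    static void bar(int m) {
        while (m > 0) {
            m = m / 4;                                // about log_4(m) iterations
        }
    }

    static void foo(int n) {
        if (n % 2 == 0) {
            for (int i = 1; i < 2 * n; i = i + 2) {   // n iterations (odd values of i)
                bar(i);
            }
        } else {
            for (int i = 1; i < n; i++) {             // n - 1 iterations
                bar(n);
            }
        }
    }

    static void algorithm(int n) {
        for (int i = 42; i < n * Math.log(n); i++) {  // about n*log(n) iterations
            foo(i);
            for (int j = 1; j < 4711; j = j * 3) {    // j = 1, 3, 9, ..., 2187: 8 iterations
                bar(i * j);
            }
        }
    }
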
  • @user1984 Sorry for the unfinished post, I needed a bit of time to embed the code and add my comments. It should be better now! Commented Jan 31, 2022 at 18:15
  • @Muhteva it is supposed to be bar(i) for the first loop and bar(n) for the second loop Commented Jan 31, 2022 at 18:16
  • That's great @DRock99 thank you for improving it :D Commented Jan 31, 2022 at 18:19

1 Answer

  1. The time complexity of bar(int m) is O(log(m)). Strictly it is log_4(m), but it can be converted to base 2: log_4(m) = log_2(m) / log_2(4), and log_2(4) = 2 is a constant. Therefore it is O(log(m)), and your analysis is correct.

  2. The time complexity of foo(int n) is calculated like this: in the first (even) branch we go from 1 up to 2n in steps of two, which takes n iterations; in the second (odd) branch we go from 1 up to n one by one, which takes n iterations. The complexity of bar(n) is O(log(n)), but the complexity of bar(i) depends on i, not n: it is O(log(i)). So the total number of operations can be counted like this:
    If n is even, the total is
    A = log(1) + log(3) + ... + log(2n-1).
    On one hand, A <= n*log(2n) = O(nlogn).
    On the other hand, the k-th term log(2k-1) is at least log(k), so A >= log(1) + log(2) + ... + log(n) = log(n!) = Θ(nlogn). (*)
    A is therefore squeezed between two bounds of order nlogn, so A = Θ(nlogn).
    If n is odd, the total is
    log(n) + log(n) + ... + log(n) = n*log(n).
    Your time complexity analysis for this second part is completely correct.

  3. The outer loop runs approximately nlog(n) times, so foo(i) is called approximately nlog(n) times. In each of those iterations, bar(i*j) is called 8 times (j = 1, 3, 9, ..., 2187), and this count does not change as n grows, so j is not really a variable that depends on the input size. The time complexity of bar(i*j) is proportional to log(i), since log(i*j) = log(i) + log(j) and log(j) is bounded by the constant log(2187).
    So for one iteration of the outer loop with an even i, the number of operations is about
    i*log(i) + 8*log(i),
    and it is the same when i is odd. In total we get the sum of i*log(i) from i=42 to i=nlog(n):
    42log(42) + 43log(43) + ... + nlog(n)*log(nlog(n)).
    There are about nlog(n) terms, each at most nlog(n)*log(nlog(n)), so the sum is <= nlog(n)*nlog(n)*log(nlog(n)) = O(n^2 * log(n)^3).
    The time complexity of the whole function is therefore O(n^2 * log(n)^3). (The counting sketch after the footnote below can be used to check this bound empirically.)

(*) If you wonder how we switched from log(n!) to nlog(n), refer to Is log(n!) = Θ(n·log(n))?
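
For completeness, here is a small counting harness (my own sketch, not part of the original exercise) that re-declares the three methods with a counter on bar's loop, so the O(n^2 * log(n)^3) bound for the whole function can be checked roughly for small n (foo alone could be measured the same way by resetting ops around a single call):

    public class ComplexityCheck {
        static long ops = 0;                            // iterations of bar's while loop

        static void bar(long m) {
            while (m > 0) {
                m = m / 4;
                ops++;
            }
        }

        static void foo(long n) {
            if (n % 2 == 0) {
                for (long i = 1; i < 2 * n; i = i + 2) bar(i);
            } else {
                for (long i = 1; i < n; i++) bar(n);
            }
        }

        static long algorithm(long n) {                 // the outer function from the question
            ops = 0;
            for (long i = 42; i < n * Math.log(n); i++) {
                foo(i);
                for (long j = 1; j < 4711; j = j * 3) bar(i * j);
            }
            return ops;
        }

        public static void main(String[] args) {
            for (long n : new long[]{200, 400, 800}) {
                double log2n = Math.log(n) / Math.log(2);
                double bound = (double) n * n * log2n * log2n * log2n;  // n^2 * log2(n)^3
                long counted = algorithm(n);
                System.out.printf("n=%4d  counted=%12d  bound=%16.0f  ratio=%.4f%n",
                        n, counted, bound, counted / bound);
            }
        }
    }

The ratio will not be an exact constant (there are lower-order log(log n) factors, and bar divides by 4 rather than 2), but it should stay in the same ballpark as n grows, while the ratio against plain n^2 keeps increasing.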


2 Comments

Awesome, thank you, I actually do understand it now!
It doesn't call bar(i*j) 4711/3 times; it calls it 8 times (j *= 3, not j += 3). But that's a detail.
