
I'm given code for an algorithm as such:

1  sum = 0;
2    for(i=0; i<n; i++)
3      for(j=1; j<= n; j*=3)
4        sum++;

I'm told this algorithm runs in O(n log n), where the log is base 3. I get that the second line runs n times, and since line 3 is independent of line 2, I have to multiply the two to get the Big O. However, I'm not sure why the answer is n log n (log base 3). Is there a guaranteed way to figure this out every time? It seems like with nested loops, a different case can occur each time.

  • The inner loop, on its own, runs fewer than n times, right? So this is better than O(n), but obviously not constant, so we use log as an upper bound because it is in between. Commented Dec 6, 2015 at 3:02
  • With time complexity, the base of the logarithm is irrelevant, so it is elided and we simply write O(log n). Commented Dec 6, 2015 at 3:22

3 Answers


What you have been told is correct. The algorithm runs in O(n log n). The third line, for(j=1; j<= n; j*=3), runs in O(log3(n)) time because j is multiplied by 3 on each iteration. To see it more clearly, solve this problem: how many times do you need to multiply 1 by 3 to reach n? That is, solve 3^x = n. The solution is x = log3(n).


3 Comments

Please stop writing constants inside Big O notation; it describes how fast a function grows in relation to its input parameter(s), not an estimated execution time.
@user3707125 where do you see constants? I wrote about log3 because the OP explicitly asked where log3 came from.
here: "runs in O(log3(n))", it would be more correct to say "runs approximately log3(n) times, which results in asymptotic complexity of O(log(n))".

Yes. The algorithm runs in n·log(n) time, where the log is base 3.

The easiest way to calculate complexity is to calculate number of operations done.

The outer for loop runs n times. Now let's count how many times the inner for loop runs for each n. For n = 1, 2, the inner loop runs once. For n = 3, 4, 5, 6, 7, 8, it runs twice. And so on...

That means that the inner for loop runs in logarithmic time (log(n)) for each n.

So n*log(n) will be total complexity.



In the second loop you have j *= 3, which means you can divide n by 3 log3(n) times before j exceeds n. That gives the inner loop O(log n) complexity.
Since the first loop is O(n), you end up with O(n log n).

