
What is the time complexity of the code below?

int j=1;
while(j<n){
  j+=log(j+5);
}
  • What's n? If it's 0 or 1, the loop body never runs; if it's greater than 1, well, it depends on n. Commented Nov 11, 2015 at 20:31

2 Answers


By expanding the first three terms of the sum:

[image: expansion of the first three terms of the sum]

You can see that it's just a sum of iterated log(log(j)) terms. Since O(j) >> O(log(j)), it follows that O(log(j)) >> O(log(log(j))); the first term therefore overshadows all of the other terms.

The sum is therefore O(log(j)), which means the time complexity is

[image: the resulting time complexity].

Numerical tests show that this is actually O(n^0.82...). [image: plot of the numerical results]


7 Comments

I didn't think log() was logarithm, I was thinking it was just some random function the poster didn't define.
@Hardy log() is almost always defined as a natural logarithm in programming languages' math libraries
Ya, I was thinking this was just a general programming question, not a math/comp sci question. I see the tags now though.
@Hardy well this is a comp-sci question, but again log is the standard name (in both contexts) for natural logarithms
Sorry, I missed a "+" before the "="; it has been fixed now.

With the edited code (+= instead of =), I'll conjecture that the asymptotic time complexity of this code (assuming that log() and other elementary operations take constant time) is Θ(n / log n).

It's easy to show that the loop takes at least (n − 1) / log(n + 5) iterations to complete, since it counts up from 1 to n, and each iteration increments the counter by an amount strictly less than log(n + 5). Since log(n + 5) = O(log n), the asymptotic time complexity is at least Ω(n / log n).

Showing that it's also O(n / log n), and thus Θ(n / log n), seems a bit trickier. Basically, my argument is that, for sufficiently large n, incrementing the counter j from exp(k) up to exp(k+1) takes on the order of C × exp(k) / k iterations (for a constant C ≈ (1 − exp(−1)) / 5, if I'm not mistaken). Letting h = ceil(log(n)), incrementing the counter from 1 to n thus takes at most

        T = C × ( exp(h) / h + exp(h−1) / (h−1) + exp(h−2) / (h−2) + … + exp(1) / 1 )

iterations. For large n (and thus h), the leading exp(h) / h term should dominate the rest, such that T ≤ C2 × exp(h) / h for some constant C2, and thus the loop should run in O(exp(h) / h) = O(n / log n) time.

That said, I admit that this is just a proof sketch, and there might be gaps or errors in my argument. In particular, I have not actually determined an upper bound for the constant(?) C2. (C2 = C × h would obviously satisfy the inequality, but would only yield an O(n) upper bound on the runtime; C2 = C / (1 − exp(−1)) would give the desired bound, but is obviously too low.) Thus, I cannot completely rule out the possibility that the actual time complexity might be (very slightly) higher.

Comments
