
I've been learning how to compute the time and space complexity of algorithms, and I found an interesting problem to practice on.

// count the occurrences of an element in an array
// list is already sorted
function count (list, item, start, end) {
    if (start > end) {
        return 0;
    }
    const mid = Math.floor((start + end) / 2);
    if (list[mid] < item) {
        return count(list, item, mid + 1, end);
    }
    if (list[mid] > item) {
        return count(list, item, start, mid - 1);
    }
    return count(list, item, start, mid - 1) + 1 + count(list, item, mid + 1, end);
}
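
For reference, a quick usage sketch (I'm assuming start and end are inclusive 0-based indices, which is what the base case and the mid ± 1 recursion suggest):

const list = [1, 2, 2, 2, 3, 5];
console.log(count(list, 2, 0, list.length - 1)); // 3
console.log(count(list, 4, 0, list.length - 1)); // 0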

The answer is O(n), but I thought it should involve a logarithm. Can somebody explain why I'm wrong?

2 Comments
  • Hint: consider the worst case. For which input would this method take the longest? Commented Oct 11, 2020 at 11:30
  • Oh, exactly! If every element in the array is the same, then the time complexity will be O(n). Thank you very much! Commented Oct 11, 2020 at 11:38

2 Answers

1

Consider the case where all elements of the list are equal to item. In that case only the first return (the base case) and the last return (which recurses into both halves) are ever taken. So for n = end - start + 1 elements in the current range, you can write the running time of your algorithm roughly as:

T(n) = O(1) + 2 * T(n/2)

which solves to T(n) = O(n) (e.g. by the master theorem: a = 2 subproblems of size n/2 with constant work per call, so the n^(log_2 2) = n term dominates).
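
To make this concrete, here is a quick empirical sketch (my own instrumentation, not part of the question's code): the same recursion with a call counter, run on arrays where every element equals the target.

let calls = 0;

function countInstrumented(list, item, start, end) {
    calls++; // count every invocation
    if (start > end) {
        return 0;
    }
    const mid = Math.floor((start + end) / 2);
    if (list[mid] < item) {
        return countInstrumented(list, item, mid + 1, end);
    }
    if (list[mid] > item) {
        return countInstrumented(list, item, start, mid - 1);
    }
    return countInstrumented(list, item, start, mid - 1) + 1
         + countInstrumented(list, item, mid + 1, end);
}

for (const n of [1000, 2000, 4000, 8000]) {
    calls = 0;
    countInstrumented(new Array(n).fill(7), 7, 0, n - 1);
    console.log(n, calls); // exactly 2n + 1 calls: doubling n doubles the work
}

The call count grows in lockstep with n, which is exactly the O(n) behaviour the recurrence predicts.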


2 Comments

Am I right that the average time complexity will be O(log n)?
@Sam, it might be; it depends on the domain/distribution of the inputs.
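
As a small follow-up to the comment above, a sketch of why the distribution matters (it reuses the countInstrumented/calls counter from the snippet in the answer): with all-distinct elements, the match splits the range into two halves that each finish after a plain binary-search descent, so the number of calls grows only logarithmically.

for (const n of [1000, 2000, 4000, 8000]) {
    calls = 0;
    const distinct = Array.from({ length: n }, (_, i) => i); // 0, 1, ..., n - 1
    countInstrumented(distinct, Math.floor(n / 2), 0, n - 1);
    console.log(n, calls); // grows like log2(n), nowhere near n
}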
1

If we have n items all equal to the target, we end up counting them one by one: each matching element is counted by its own

... + 1 + ...

in the last return, so there are at least n such calls; the run time is linear in this case.

