
I have an algorithm that runs in O(m) time. This algorithm takes in a set of data containing m entries.

The data is generated randomly by specifying a strictly positive integer input n. The number of entries generated is O(n log n).

Edit

The time complexity of generating the data itself is independent of n (i.e. O(1)), which means that given the integer n, the entries are instantly and randomly generated. The number of resulting entries is random, but it is O(n log n). E.g. for n = 10, the number of entries generated is some constant times 10 log 10.

The data is generated beforehand. Then the resulting m entries are fed into the algorithm as input.
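For concreteness, here is a minimal Python sketch of the setup; generate_data and algorithm are hypothetical stand-ins, not my actual code:

```python
import math
import random

def generate_data(n):
    # Hypothetical stand-in for the generator: the number of entries is
    # random, but bounded by a constant times n log n, so m = O(n log n).
    bound = max(1, int(2 * n * math.log(n + 1)))
    m = random.randint(1, bound)
    return [random.random() for _ in range(m)]

def algorithm(entries):
    # Hypothetical O(m) algorithm: a single linear pass over the m entries.
    total = 0.0
    for value in entries:
        total += value
    return total

data = generate_data(10)   # m = len(data) entries, with m = O(n log n)
result = algorithm(data)   # O(m) work, i.e. O(n log n) with respect to n
```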

Question

Can I then assume that the algorithm runs in O(n log n) time?

  • assuming n log n = m, then why not? Commented Dec 3, 2012 at 18:24
  • Traditionally speaking, the n in big O notation indicates the size of the input - so if your algorithm is linear in the size of the input, it is O(n). However, this is often relaxed (a common example is graphs, where we use O(E+V) - not n as the size of the input, but some properties of that input). Commented Dec 3, 2012 at 18:28

1 Answer


There are some ambiguities in your question that were either deliberately placed to help you internalize the relationship between input size and run-time complexity, or simply caused by miscommunication.

So as best as I can interpret this scenario:


Your algorithm's complexity, O(m), is linear with respect to m.

Since we assume that generating the data is independent of the input (i.e. O(1)), your time complexity depends only on the n that you specify to generate the entries.

So yes, you can say that the algorithm runs in O(n log n) time, since it doesn't do anything with the input of size m.


In response to your updated question:

It's still hard to follow because some key words refer to different things. But in general I think this is what you are getting at:

  • You have a data set as input whose size is O(n log n), given some specific n.
  • This data set is used as input only; it's either pre-generated, or generated using some blackbox that runs in O(1) time regardless of what n is given to the blackbox. (We aren't interested in the blackbox for this question.)
  • This data set is then fed to the algorithm that we are actually interested in analyzing.
  • The algorithm has time-complexity O(m), for an input of size m.
  • Since your input has size O(n log n) with respect to n, then by extension your O(m) linear-time algorithm has time complexity O(n log n) with respect to n (a minimal sketch follows this list).
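
As a rough illustration, here is a small Python sketch; the counting loop is a hypothetical stand-in for any O(m) algorithm, and the input sizes are chosen only to show the growth:

```python
import math

def linear_algorithm(entries):
    # Hypothetical O(m) algorithm: one unit of work per entry.
    ops = 0
    for _ in entries:
        ops += 1
    return ops

for n in (10, 100, 1000):
    m = int(n * math.log(n))          # input size m on the order of n log n
    work = linear_algorithm(range(m))
    print(n, m, work)                 # work == m, so it grows like n log n in n
```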

To see the difference: suppose your algorithm weren't linear but rather quadratic, O(m^2); then it would have time complexity O(n^2 log^2 n) with respect to n.
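Concretely, writing m <= c * n log n for some constant c (the constant is just a label for the hidden bound, not something given in the question): the linear case gives running time at most proportional to c * n log n = O(n log n), and the quadratic case gives at most (c * n log n)^2 = c^2 * n^2 log^2 n = O(n^2 log^2 n).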


4 Comments

IMHO there is a possible contradiction in the question. The number of entries generated is O(n log n). We assume that generating the data is independent of input. i.e. O(1).
@axiom yeah, that part bothered me; the only way I saw to resolve the inconsistency is if the algorithm discards the size-m input and just generates random entries based on the integer input n.
Thanks for the answer +1. Sorry, it was my mistake. I edited the question to hopefully fix the ambiguity. Will accept your answer in a while.
Thanks a lot for updating your answer. That is exactly what I was trying to say.
