4

I'm busy doing an assignment and I'm struggling with a question. I know I'm not supposed to ask assignment questions outright so I understand if I don't get straight answers. But here goes anyway.

We must calculate the run-time complexity of different algorithms; the one I'm stuck on is this:

for(int i = 1 ; i < n ; i++)
    for(int j = 0 ; j < i ; j +=2)
        sum++;

Now, my first thought was that it would be less than O(n²), because the nested loop isn't running the full n times, and the j variable increments by 2 each iteration rather than by 1 like a normal for loop. However, when I ran some code simulations with N = 10, N = 100, N = 1000, etc., I got the following results when I output the sum variable.

N = 10 : 25
N = 100 : 2500
N = 1000 : 250000
N = 10000 : 25000000

Looking at these results, the Big-O notation seems like it should be much larger than just O(n).

The four options we have been given in the assignment are O(1), O(n²), O(n) and O(log n). As I said earlier, I can't see how it can be as large as O(n²), but the results are pointing to that. So I think I don't fully understand this, or I'm missing some link.

Any help would be appreciated!

6
  • "the j variable is doubling each loop". I think you're misreading it. j*=2 would be doubling; j+=2 is incrementing by two. Commented Feb 21, 2014 at 14:02
  • Yes sorry, that's a mistype :) I'll edit that out now. Commented Feb 21, 2014 at 14:02
  • Well it is between O(n) and O(n^2) so I think it defaults to the larger one. Commented Feb 21, 2014 at 14:04
  • 3
    Hint: What's the sum 1 + 2 + ... + n? How does omitting every other term affect this? Commented Feb 21, 2014 at 14:04
  • 3
    @Chemistpp 3n * ln n is also between linear and quadratic, and even though it's technically in O(n^2), that's not a tight bound (which is what we're usually talking about). It's in O(n log n). Commented Feb 21, 2014 at 14:05

6 Answers

11

Big O notation does not give you the number of operations. It just tells you how fast it will grow with growing input. And this is what you observe.

When you increase the input c times, the total number of operations grows c² times.

If you calculated the exact number of operations, you would get (nearly) n²/4.

Of course you could calculate it with sums, but since I don't know how to typeset math on SO, I'll give an "empirical" explanation. A simple loop-within-a-loop with the same start and end conditions gives n². Such a loop produces a matrix of all possible combinations of i and j. So if the start is 1 and the end is n in both cases, you get n * n combinations (or iterations, effectively).

However, your inner loop runs only while j < i. That turns the square into a triangle, which is the first 0.5 factor; then you skip every other element, which is another 0.5 factor. Multiplied together, you get 1/4.

And O(0.25 * n^2) = O(n^2). Sometimes people like to leave the factor in because it lets you compare two algorithms of the same complexity, but it does not change the rate of growth with respect to n.


2

Bear in mind that big-O is asymptotic notation. Constants (additive or multiplicative) have zero impact on it.

So, the outer loop runs n - 1 times, and on the ith pass the inner loop runs about i / 2 times. If it weren't for the / 2 part, it would be the sum of the numbers 1 .. n, which is the well-known n * (n + 1) / 2. That expands to a * n^2 + b * n + c for a non-zero a, so it's O(n^2).

Instead of summing n numbers, we're summing n / 2 numbers. But that's still somewhere around (n/2) * ((n/2) + 1) / 2. Which still expands to d * n^2 + e * n + f for a non-zero d, so it's still O(n^2).


0

The thing is that the number of operations here depends on the square of n, even though the overall number is less than n². Nevertheless, the scaling is what matters for Big-O notation, so it's O(n²).

2 Comments

  • Here "dependent on" only means "proportional to". The actual number also depends on n (via the function f(x) = cx² for some c), but it's not in O(n).
  • This made sense to me, thank you. I kept thinking that it should be exact and that the Big-O should be the "minimum" in a sense.
0

With:

for (int i = 1 ; i < n ; i++)
    for (int j = 0 ; j < i ; j +=2)
        sum++;

We have:

0+2+4+6+...+2N == 2 * (0+1+2+3+...+N) == 2 * (N * (N+1) / 2) == N * (N+1)

so, with n == 2N, we have (n / 2) * (n / 2 + 1) ~= (n * n) / 4

so O(n²)


0

Your understanding of time complexity is not quite right. Time complexity is not only a matter of the sum variable; sum only counts how many times the inner loop runs, but you also have to consider the total number of outer-loop iterations.

now consider your program:

for (int i = 1 ; i < n ; i++)
    for (int j = 0 ; j < i ; j += 2)
        sum++;

Time complexity describes the running time of your program with respect to the input values (here, n). Running time here does not mean the actual time required to execute your program on your computer; actual time varies from machine to machine. To get a machine-independent measure of running time, Big-O notation is very useful. Big O comes from mathematics, and it describes running time in terms of mathematical functions.

The outer loop executes a total of (n - 1) times. For each of these iterations (starting from i = 1), the inner loop runs about i/2 times, so the total number of inner-loop iterations is 1 + 1 + 2 + 2 + 3 + 3 + ... + (n/2) + (n/2) = 2 * (1 + 2 + 3 + ... + n/2) = 2 * ((n/2) * (n/2 + 1)) / 2 = n^2/4 + n/2. The statement sum++ is likewise executed n^2/4 + n/2 times.

Now let the cost of line 1 of the program be c1, the cost of line 2 be c2, and the cost of line 3 be c3. These costs can differ from machine to machine. The total time required to execute the program is then c1 * (n - 1) + c2 * (n^2/4 + n/2) + c3 * (n^2/4 + n/2) = (c2 + c3) * n^2/4 + (c2 + c3) * n/2 + c1 * n - c1. Thus the required time can be expressed as a mathematical function of n.

In Big-O notation you could write this as O((c2 + c3) * n^2/4 + (c2 + c3) * n/2 + c1 * n - c1). With Big O, lower-order terms and the coefficient of the highest-order term can be ignored, because for large n, n^2 is much greater than n. So this simplifies to O((c2 + c3) * n^2/4), and since that differs from n^2 only by a constant factor, it is simply O(n^2).


0

From your output it seems like: sum ~= (n^2)/4.

This is obviously O(n^2) (in fact you can replace the O with Θ). You should recall the definition of Big-O notation; see http://en.wikipedia.org/wiki/Big_O_notation.

