Say there is an algorithm/function that runs as follows:
function(n)
    int x = 1
    int y = 1
    while (y <= n) {
        x++
        y = y + x
        print("something")
    }
If I wanted to use Big-O notation to describe its running time, I would guess it is O(log n).
However, I say this only because of a general rule of thumb: if an algorithm skips over some elements of an input of size n (as this one does), its running time is often O(log n). How would I go about proving this, though? Should I examine the summation that y forms and its relationship to n? Are there other approaches? I appreciate any help.