This is a homework question, so I would like to avoid complete answers and would much prefer hints if possible.
Given an array of random integers A[1...x], the program should return the y smallest elements of the array in increasing order, where 1 <= y <= sqrt(x). So, basically, given the array [5,9,2,8] and y=2, the program should return [2,5].
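Just to make the expected behaviour concrete, here's a tiny Python sketch (the helper name is just mine for illustration) that gets the right output by sorting and slicing, even though, as I explain below, sorting is exactly what's ruled out:

    # Reference behaviour only: sort A, then take the first y items.
    # (smallest_y_reference is a made-up name; the sort-based approach is ruled out below.)
    def smallest_y_reference(A, y):
        return sorted(A)[:y]

    print(smallest_y_reference([5, 9, 2, 8], 2))   # prints [2, 5]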
The "sort first, return first y items" answer is out the window since the best we can do is n*logn time with merge or quicksort. The answer thus has to make use of the fact that we only return at most sqrt(x) items, and the only other answer I have so far is doing a for-loop search for the minimum element in the array, removing the minimum from the array, stashing it in a new array, say B, and repeating the process on the now smaller modified version of A of length x-1, which gives us running time like so:
x + (x-1) + (x-2) + ... + (x-y+1)
That counts the for-loop iterations of the min-search: there are at most y <= sqrt(x) passes in the worst case, each over at most x items, so we get sqrt(x)*x in total. That isn't actually any better than O(x*log(x)), and it's certainly not the O(x) I'm after :/.
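To show exactly what I mean, here's a rough Python sketch of that repeated min-search approach (the function name is just mine for illustration); it implements the O(x*sqrt(x)) idea described above, not the faster solution I'm looking for:

    def first_y_by_repeated_min(A, y):
        # Repeated min-search: y passes, each a linear scan of what's left,
        # so roughly x + (x-1) + ... + (x-y+1) comparisons overall.
        remaining = list(A)          # work on a copy so A itself is untouched
        B = []
        for _ in range(y):
            min_i = 0
            for i in range(1, len(remaining)):   # for-loop search for the minimum
                if remaining[i] < remaining[min_i]:
                    min_i = i
            B.append(remaining.pop(min_i))       # remove the minimum and stash it in B
        return B

    print(first_y_by_repeated_min([5, 9, 2, 8], 2))   # prints [2, 5]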