I have come up with a sorting algorithm that appears to run in $O(n \log \log n)$. Could anyone help me determine whether it is already commonly known, or whether it is worth anything?
The time complexity seems to satisfy the recurrence $T(n) = \sqrt{n} \cdot T(\sqrt{n}) + O(n)$, which solves to $O(n \log \log n)$.
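To see where $O(n \log \log n)$ comes from, divide the recurrence through by $n$ and use the standard substitution for square-root recurrences:

```latex
Let $S(n) = T(n)/n$. Dividing $T(n) = \sqrt{n}\,T(\sqrt{n}) + O(n)$ by $n$ gives
\[
S(n) = S(\sqrt{n}) + O(1).
\]
Substituting $n = 2^{2^k}$, so that $\sqrt{n} = 2^{2^{k-1}}$, the recursion bottoms out after
$k = \log_2 \log_2 n$ steps, each contributing $O(1)$. Hence
\[
S(n) = O(\log \log n) \quad\Longrightarrow\quad T(n) = O(n \log \log n).
\]
```

Note this assumes the recursion splits the input into $\sqrt{n}$ pieces of size roughly $\sqrt{n}$ each, which for bucketing only holds if the values are reasonably evenly distributed.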
This algorithm divides the input into $\sqrt{n}$ pieces in $O(n)$, whereas Merge Sort takes $O(1)$ and Quick Sort takes $O(n)$ to divide into 2 pieces. Conversely, this algorithm takes $O(\sqrt{n})$ to merge the pieces, whereas Merge Sort takes $O(n)$ and Quick Sort takes $O(1)$.
Pseudocode for percentile_sort (p_sort):
percentile_sort(input_array):
    if size(input_array) <= 1 then
        return input_array
    else if size(input_array) == 2 then
        if input_array[1] < input_array[0] then
            input_array[0], input_array[1] = input_array[1], input_array[0]
        return input_array
    min_value = min(input_array) -- O(n)
    max_value = max(input_array) -- O(n)
    if min_value == max_value then
        return input_array
    num_buckets = max(2, square_root(size(input_array)))
    buckets = array of num_buckets empty lists
    foreach value in input_array -- loops n times, O(n) total
        bucket_index = integer((value - min_value)/(max_value - min_value) * num_buckets) -- O(1)
        if bucket_index == num_buckets then -- max_value would land past the last bucket
            bucket_index = num_buckets - 1
        append value to buckets[bucket_index]
    sorted_buckets = empty list
    foreach bucket in buckets -- square_root(n) recursive calls
        call percentile_sort(bucket) -- recursive call
        append returned values to sorted_buckets
    return sorted_buckets
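The pseudocode leaves a few details implicit (initializing the buckets, and the edge case where `value == max_value` computes an index one past the last bucket), so here is one possible direct translation to Python. This is my reading of the pseudocode, not the author's linked implementation; the index clamp is an assumption needed to make it correct:

```python
import math

def percentile_sort(arr):
    """Recursively bucket by value percentile, then sort each bucket."""
    n = len(arr)
    if n <= 1:
        return arr
    if n == 2:
        return arr if arr[0] <= arr[1] else [arr[1], arr[0]]
    lo, hi = min(arr), max(arr)  # O(n) each, not O(1)
    if lo == hi:
        return arr  # all elements equal
    num_buckets = max(2, math.isqrt(n))
    buckets = [[] for _ in range(num_buckets)]
    for v in arr:
        # Clamp so v == hi lands in the last bucket instead of index num_buckets.
        i = min(num_buckets - 1, int((v - lo) / (hi - lo) * num_buckets))
        buckets[i].append(v)
    out = []
    for b in buckets:
        out.extend(percentile_sort(b))
    return out
```

Since `lo != hi`, the minimum and maximum always fall into different buckets, so every recursive call strictly shrinks its input and the recursion terminates. The even-split assumption behind the $T(n) = \sqrt{n}\,T(\sqrt{n}) + O(n)$ recurrence does not hold for skewed value distributions, which is where this degrades.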
Here is my implementation and more details.
Comments:

"min and max are both $O(n)$ operations, and calculating square_root(n) is at best $O(\log n)$."

"… min(input_array) of an array is O(1)?"

"… list.sort(). Perhaps the quick sort you are comparing against is not very robust? It would also be good to see averaging over multiple runs to discount things like CPU cache affecting performance etc."
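As the last comment suggests, any timing comparison should be repeated over several runs on fresh copies of the data, so that CPU caches and already-sorted inputs do not skew the result. A minimal harness along those lines (`best_time` is a name I am introducing here, assuming the sort under test takes and returns a list) might look like:

```python
import random
import timeit

def best_time(sort_fn, n=10_000, repeats=5):
    """Best-of-`repeats` wall time for sorting `n` random floats.

    Each run sorts a fresh copy of the data, so earlier runs cannot
    hand later runs pre-sorted input.
    """
    data = [random.random() for _ in range(n)]
    return min(
        timeit.timeit(lambda: sort_fn(list(data)), number=1)
        for _ in range(repeats)
    )

# Example: baseline against the built-in Timsort.
baseline = best_time(sorted)
```

Taking the minimum rather than the mean is a common choice for microbenchmarks, since system noise only ever makes runs slower, never faster.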