I've been trying to compare the runtimes of selection sort, quicksort, and merge sort, and so far this is the table I have:
| n     | S(n)    | S(n)/n^2    | Q(n)  | Q(n)/(n log2 n) | M(n)  | M(n)/(n log2 n) |
|-------|---------|-------------|-------|-----------------|-------|-----------------|
| 1000  | 1.4     | 1.40000E-06 | 0.104 | 1.04357E-05     | 0.151 | 1.51518E-05     |
| 2000  | 5.41    | 1.35250E-06 | 0.223 | 1.01680E-05     | 0.327 | 1.49100E-05     |
| 4000  | 21.23   | 1.32688E-06 | 0.486 | 1.01540E-05     | 0.704 | 1.47086E-05     |
| 8000  | 84.15   | 1.31484E-06 | 1.036 | 9.98783E-06     | 1.496 | 1.44226E-05     |
| 16000 | 336.47  | 1.31434E-06 | 2.179 | 9.75151E-06     | 3.177 | 1.42178E-05     |
| 32000 | 1322.99 | 1.29198E-06 | 4.673 | 9.75767E-06     | 6.709 | 1.40090E-05     |
n is the size of the array being sorted.
I know that S(n) is the total runtime of the selection sort program (and Q(n) and M(n) the runtimes of quicksort and merge sort). What I don't get is why I'm being asked to divide each total by its algorithm's order of growth (e.g., selection sort is O(n^2), so S(n) is divided by n^2). What do the resulting values (3rd, 5th, and 7th columns) represent, and what information can I infer from them?
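For what it's worth, this is roughly how I compute the ratio columns (a minimal Python sketch; the runtime numbers are copied straight from the table above, and I'm assuming they're in seconds):

```python
import math

# Measured runtimes from the table: (n, selection, quick, merge).
measurements = [
    (1000, 1.4, 0.104, 0.151),
    (2000, 5.41, 0.223, 0.327),
    (4000, 21.23, 0.486, 0.704),
]

for n, s, q, m in measurements:
    # Divide each measured time by its algorithm's order of growth:
    # selection sort is O(n^2); quicksort and merge sort are O(n log2 n).
    print(n, s / n**2, q / (n * math.log2(n)), m / (n * math.log2(n)))
```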
If anyone can briefly explain this to me, that would be greatly appreciated.