I am really sorry for going off-topic, and I hope you will not kill me,
but this is a vitally important problem for me that needs to be solved
within the next 12 hours.
I have to create a stable algorithm for sorting n numbers from the
interval [1, n^2] with time complexity O(n).
Can someone please give me a hint? I would be very, very thankful!
Thanks in advance!
D.
Counting sort?
http://en.wikipedia.org/wiki/Counting_sort
Its time complexity is O(n), but the space complexity in your case is O(n^2).
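A minimal sketch of a stable counting sort in Python, for reference (the
`key` and `k` parameters are my own illustration, not from any particular
source):

```python
def counting_sort(items, key, k):
    """Stably sort items whose key(x) is an int in [0, k).

    Time O(n + k), space O(n + k). For the original question the keys
    range over [1, n^2], so k is n^2 and the count array dominates.
    """
    counts = [0] * k
    for x in items:
        counts[key(x)] += 1
    # Prefix sums: counts[v] becomes the index of the first slot for key v.
    total = 0
    for v in range(k):
        counts[v], total = total, total + counts[v]
    out = [None] * len(items)
    for x in items:  # left-to-right scan keeps equal keys in input order
        out[counts[key(x)]] = x
        counts[key(x)] += 1
    return out
```

The left-to-right output pass is what makes it stable: two items with the
same key land in the order they were scanned.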
--
Aram Hăvărneanu
I would say it's not O(n). It has to iterate over the count array, so
it's O(n^2) in this case, though it depends a little on how you measure.
However, since we are just sorting numbers, I'd say an access to the
count array should be counted the same as an access to the array being
sorted.
Not that I have an alternative proposal at this stage...
Ben.
I've never seen an algorithm for sorting that's better than O(n log n)
on average. Are there any? I can't even imagine how a sort could
possibly be done in O(n) time, since that would require looking at
each element only once...
That being said, if I had to quickly write a program that had a speedy
stable sort, I'd either write it in C++ and use std::stable_sort() (if
I weren't a student doing this for a class) or try my hand at writing
a merge sort that makes decent use of locality of reference to be fast
enough (if I knew a professor would fail me for using
std::stable_sort()).
~Matt
All the algorithms I've seen that sort in less than O(n log n) time
require knowledge of your data-set's features. The O(n log n)
family of sorts is comparison-based, and O(n log n) is the proven
lower bound for comparison sorting. Using methods other than
comparison is how you jump to O(n) with your counting/radix/bucket
sorts. This usually involves knowing the upper/lower bounds of the
data, the number of items to be sorted, and/or the distribution
within them.
My copy of _Introduction to Algorithms_ (by Cormen, Leiserson,
and Rivest) has an entire chapter (#9) on "Sorting in linear
time" that introduces all three of the aforementioned algorithms
with sample pseudo-code implementations for each.
The counting sort uses the actual elements as indices, as when
sorting a shuffled deck of cards:

    # place each card directly into its slot, indexed by suit and rank
    destination = [[None] * 13 for _ in range(4)]
    for card in shuffled(cards):
        destination[card.suit][card.rank] = card
Radix & bucket sorts are a bit more complex, but aren't bad. I
believe the radix sort is O(N * D) (where N="number of items" and
D="number of digits/length-of-string"), and the bucket sort is
O(N * C) where N="number of items" and C="degree of
collision/commonality" (so if you have a lot of data that has the
same keys for your buckets, you'll actually get *worse*
performance due to the collisions). They both presume that D/C
are fairly small and constant for your data-set.
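For the original question, the radix sort is exactly the relevant trick:
each number in [1, n^2] can be written as two base-n digits, so two
stable counting-sort passes (least significant digit first) give a
stable O(n) sort. A rough Python sketch, my own illustration rather than
anything from the book:

```python
def radix_sort_base_n(nums):
    """Stably sort n integers drawn from [1, n^2] in O(n) time.

    Each value is treated as two digits in base n; we run two stable
    counting-sort passes, least significant digit first.
    """
    n = len(nums)
    if n == 0:
        return []

    def counting_pass(arr, digit):
        # Stable counting sort on one base-n digit of (x - 1).
        counts = [0] * n
        for x in arr:
            counts[(x - 1) // n**digit % n] += 1
        total = 0
        for d in range(n):
            counts[d], total = total, total + counts[d]
        out = [None] * n
        for x in arr:
            d = (x - 1) // n**digit % n
            out[counts[d]] = x
            counts[d] += 1
        return out

    return counting_pass(counting_pass(nums, 0), 1)
```

Each pass is O(n) in both time and space (the count array has only n
entries, not n^2), and because each pass is stable, sorting by the low
digit first and the high digit second orders the whole values correctly.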
-tim
I heard a lot of good things about bubble sort:
http://www.youtube.com/watch?v=k4RRi_ntQc8
With a really fast computer, it should sort n^2 elements
in O(n) complexity (or the other way around, whatever).
Now if you have a quantum computer around you, try
the quantum bogosort O(n) algorithm:
http://en.wikipedia.org/wiki/Bogosort
To avoid being off-topic, check also ":help sort" and
good luck with your homework!
-- Dominique
An incorrect assumption about how all sorting algorithms must work, apparently!
> All the algorithms I've seen for sorting in less than O(n log n)
> have required knowledge of your data-set's features. The O(n log
> n) family of sorts involve comparisons, where this is the lower
> bound. Using methods other than comparison for sorting is where
> you jump to O(n) with your counting/radix/bucket sorts. This
> usually involves knowing the upper/lower bounds of the data, the
> number of items to be sorted, and/or the distribution within them.
>
> My copy of _Introduction to Algorithms_ (by Cormen, Leiserson,
> and Rivest) has an entire chapter (#9) on "Sorting in linear
> time" that introduces all three of the aforementioned algorithms
> with sample pseudo-code implementations for each.
Nice. I might have to get a copy of a real algorithms book; clearly
my class books left me a bit uninformed about sorting
algorithms. :)
~Matt
Not in general, but in specific cases. In this case, the counting sort
complexity is O(n^2) because the numbers are drawn from [1, n^2]. If we
were sorting n numbers from [1, n], then the complexity would be O(n).
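To make the dependence on the key range concrete, a back-of-the-envelope
operation count for counting sort (a hypothetical helper, purely for
illustration): one pass over the input to count, one pass over the count
array of size k, and one output pass.

```python
def counting_sort_ops(n, k):
    """Rough operation count for counting sort on n keys in [1, k]:
    count pass (n) + prefix-sum pass (k) + output pass (n)."""
    return 2 * n + k
```

With k = n the total stays linear in n; with k = n^2 the middle pass
over the count array dominates everything else.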
> I can't even imagine how a sort could
> possibly be done in O(n) time, since that would require looking at
> each element only once...
Well, it would require looking at each element only a constant number of
times.
Ben.
...or even a bounded number of times (i.e., the number of times might be
variable, but it admits a fixed upper bound independent of n).
Best regards,
Tony.
--
No violence, gentlemen -- no violence, I beg of you! Consider
the furniture!
-- Sherlock Holmes
That's what the word 'only' above was supposed to convey. Good
clarification, though.
Ben.