Hacker News

I wrote a similar sorting algorithm around 14 years ago (a fast, in-place, adaptive merge sort using rotations/swaps, and, I think, stable). This is about all I remember from that time...
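For readers curious what "merge using rotation" can look like: here is a minimal Python sketch of a rotation-based, in-place, stable merge in the style of SymMerge (Kim–Kutzner). This is my own reconstruction of the general technique, not the parent's actual algorithm; the pivot-selection and boundary details are one reasonable choice among several.

```python
from bisect import bisect_left, bisect_right

def rotate(a, lo, mid, hi):
    """Rotate a[lo:hi] so a[mid:hi] comes before a[lo:mid],
    via three in-place reversals (O(hi - lo) moves, O(1) space)."""
    a[lo:mid] = a[lo:mid][::-1]
    a[mid:hi] = a[mid:hi][::-1]
    a[lo:hi] = a[lo:hi][::-1]

def merge_in_place(a, lo, mid, hi):
    """Stably merge the sorted runs a[lo:mid] and a[mid:hi] in place.

    Pick a pivot from the middle of the larger run, binary-search its
    position in the other run, rotate the middle section so everything
    below the pivot precedes everything above it, then recurse on the
    two smaller merge problems. The larger run halves at each level,
    so the recursion depth is O(log n)."""
    if lo >= mid or mid >= hi:
        return
    if mid - lo >= hi - mid:
        i = (lo + mid) // 2                 # pivot from the (larger) left run
        j = bisect_left(a, a[i], mid, hi)   # bisect_left keeps equal keys stable
        rotate(a, i, mid, j)
        k = i + (j - mid)                   # pivot's final position
        merge_in_place(a, lo, i, k)
        merge_in_place(a, k + 1, j, hi)     # pivot itself is already placed
    else:
        j = (mid + hi) // 2                 # pivot from the (larger) right run
        i = bisect_right(a, a[j], lo, mid)  # bisect_right keeps equal keys stable
        rotate(a, i, mid, j)
        k = i + (j - mid)
        merge_in_place(a, lo, i, k)
        merge_in_place(a, k, j, hi)

def merge_sort(a, lo=0, hi=None):
    """Top-down merge sort built on the rotation-based merge above."""
    if hi is None:
        hi = len(a)
    if hi - lo <= 1:
        return
    mid = (lo + hi) // 2
    merge_sort(a, lo, mid)
    merge_sort(a, mid, hi)
    merge_in_place(a, lo, mid, hi)
```

Each merge does O(log n) binary-search levels, but the rotations at one level can touch O(n) elements, which is exactly why the overall bound needs a careful argument rather than a direct application of the master theorem.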

I was able to prove (on paper) that the whole algorithm was actually O(n log n). The moves seem to have a bad recursion pattern, but assuming my proof was right, a well-behaved implementation is possible in practice. However, the recursion pattern didn't fit the master theorem; I remember having to do a Taylor expansion of a formula giving an upper bound on the number of computation steps to prove the O(n log n) bound.

And the runtime measurement graphs showed the expected O(n log n) curve, as in your case. So maybe they are not O(n (log n)^2) as you feared... I did not keep much information from that time :P.



Correction: technically, it uses O(log n) space for the recursion, so it is not, strictly speaking, in-place, which would mean O(1) space.


It's possible to implement a fixed 64-entry stack (indices fit in 64 bits, so the recursion never goes deeper than 64 levels) so you can call it O(1) and in-place.

But that's such an exercise in futility that I refuse to do it. So it's theoretically and practically in-place, but not technically.


Of course, this is nitpicking :). For all practical purposes, this O(log n) is O(1).

If you are interested, I can try to recover the proof that the rotation step can be done in O(n), which would allow the master theorem to be applied to the main recursion, giving the O(n log n) result.
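For what it's worth, if the rotation is done with the classic three-reversal trick (I don't know whether that's the variant meant here), the O(n) bound on moves is immediate. A sketch with an explicit swap counter, so the bound can be checked directly:

```python
def rotate_counted(a, lo, mid, hi):
    """Rotate a[lo:hi] so a[mid:hi] comes before a[lo:mid], using three
    in-place reversals, and return the number of swaps performed.

    Total swaps: (mid-lo)//2 + (hi-mid)//2 + (hi-lo)//2 <= hi - lo,
    i.e. O(n) moves and O(1) extra space for a window of n elements."""
    swaps = 0
    def reverse(i, j):                 # reverse a[i:j] in place
        nonlocal swaps
        j -= 1
        while i < j:
            a[i], a[j] = a[j], a[i]
            swaps += 1
            i += 1; j -= 1
    reverse(lo, mid)                   # reverse the left part
    reverse(mid, hi)                   # reverse the right part
    reverse(lo, hi)                    # reverse the whole window
    return swaps
```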


I'm not that interested, as I'd prefer to count each move and prove it that way, but perhaps Peter (orlp) is interested?




