- Oct 14, 1999
- 4,375
- 0
- 0
Yeah, there is certainly more to efficiency than reduced levels of recursion. My sorting algorithm is slightly less complex than the merge sort example I saw, so when working with small sets of data, my sort may be more efficient even if it goes a few levels deeper. I also avoided wasting a lot of memory by passing my vector by reference along with the start and end positions, rather than making two smaller vectors for the left and right halves and passing those.
I could cut a few levels of recursion by adding a test to see if the range is already sorted (so it wouldn't go ALL the way down: if everything was sorted at the 60th recursive call, it would terminate there instead of at the 66th)...the problem with that is that I'd then be performing an extra 60 operations each time, so in this case it would only pay off if my function already had more than 10 comparisons in it.
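That early-exit test is basically one linear scan over the range, which is why it costs something on every call whether or not it saves any recursion. A minimal sketch (the function name is mine; the standard library's std::is_sorted does the scan):

```cpp
#include <algorithm>
#include <vector>

// One linear pass over v[lo, hi). If it returns true, a merge sort
// could stop recursing on this range immediately. The trade-off is
// exactly as described: up to (hi - lo - 1) extra comparisons on
// every call, sorted or not.
bool alreadySorted(const std::vector<int>& v, int lo, int hi) {
    return std::is_sorted(v.begin() + lo, v.begin() + hi);
}
```

You'd call it at the top of the recursive function, right after the base-case check, and return early when it comes back true.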
What is the O() notation all about, anyway? (Sorry, like I said, I never took analysis of algorithms.) n is the number of elements that must be sorted, right? log n seems way too small, then.
Interesting stuff, anyway.