Date: Wed, 6 Mar 96 13:57:24 CST
From: rusin (Dave Rusin)
To: dtorgers@gulf.uvic.ca
Subject: Re: an algorithm?
Newsgroups: sci.math

In article <4hkl6d$2mpt@uvaix3e1.comp.UVic.CA> you write:
>I am looking for an algorithm:
>
>	Given n elements in a list it will find the square root n largest
>elements. For example, if there are 100 integers, the algorithm will find
>the 10 largest elements.

In practice it seems easiest to sort the whole list in whatever way suits
you, keeping only the sqrt(n) largest values as you go. It would have to be
a pretty big list, or a situation with time at an absolute premium, to
warrant worrying about anything else. This can be done with O(n log n)
moves.

I can suggest variants which might improve the implied constant, but they
all seem to remain O(n log n) in speed. For example, you could break the
data into groups with sqrt(n) items in each; sort those groups; then go
group by group looking for the best-so-far. If you can sort a list of K
objects in time A*K log K, then the first pass (sorting sqrt(n) groups of
sqrt(n) items each) requires time A/2 * n * log(n); comparing a (sorted)
best-so-far list with a sorted group takes time no longer than the sum of
the lengths, i.e. 2 sqrt(n), so all sqrt(n) groups may be compared in time
much less than n log(n). Thus, for large n, it's the first pass which costs
the most time. But you have only saved half of the time A * n * log(n) it
would have taken just to sort the whole list of n items.

If you really did have n=100 and just needed to find the ten largest ASAP,
I suppose I would do something just like that -- the most easily coded
sorts for 10 objects take very few machine cycles, and the sort could be
done 10 times, each time merging the result into the best-so-far list.
Seems like a couple thousand machine instructions would do it.

dave
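A sketch of the first suggestion -- scanning the list once while "keeping only the sqrt(n) largest values as you go". The bounded min-heap is my choice of bookkeeping for the kept values, not something the note specifies; any sorted buffer of size sqrt(n) would do.

```python
import heapq
import math

def k_largest(items, k=None):
    """Scan the list once, keeping only the k largest values seen so far.

    A min-heap of size k holds the current best: each new item either
    displaces the smallest kept value or is discarded.  Costs
    O(n log k) comparisons; k defaults to floor(sqrt(n)).
    """
    items = list(items)
    if k is None:
        k = max(1, math.isqrt(len(items)))
    heap = []                       # min-heap of the best k seen so far
    for x in items:
        if len(heap) < k:
            heapq.heappush(heap, x)
        elif x > heap[0]:           # beats the smallest kept value
            heapq.heapreplace(heap, x)
    return sorted(heap, reverse=True)
```

For the asker's example, `k_largest(list_of_100_integers)` returns the ten largest, in descending order.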
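And a sketch of the grouping variant described above: break the n items into groups of about sqrt(n), sort each group, then fold each sorted group into a sorted best-so-far list with a linear merge. The function name and the merge details are mine; the group sorts dominate the cost, as argued in the note.

```python
import math

def sqrt_largest_by_groups(items):
    """Find the ~sqrt(n) largest items via sqrt(n) sorted groups.

    Each group of ~sqrt(n) items is sorted, then merged into the
    best-so-far list.  Merging two descending lists of length <= k
    costs at most the sum of their lengths, so the sqrt(n) merges
    together cost about 2n comparisons -- far less than the group sorts.
    """
    items = list(items)
    n = len(items)
    k = max(1, math.isqrt(n))
    best = []                                   # descending, length <= k
    for start in range(0, n, k):
        group = sorted(items[start:start + k], reverse=True)
        # linear merge of two descending lists, keeping only the top k
        merged, i, j = [], 0, 0
        while len(merged) < k and (i < len(best) or j < len(group)):
            if j >= len(group) or (i < len(best) and best[i] >= group[j]):
                merged.append(best[i]); i += 1
            else:
                merged.append(group[j]); j += 1
        best = merged
    return best
```

With n=100 this is exactly the "ten easy sorts of ten items, merged into the best-so-far list" plan from the last paragraph.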