in reply to Evolving a faster filter?

In the “further food for thought” department, it occurs to me that this is not entirely a pure-combinatorics problem.   A filter might appear to be “better” in one position than it would have been in another, precisely because some other filter preceded it and had already removed the elements that it would otherwise have had to reject.   A key consideration ... and only you can answer this ... is which goal matters most:   that the result-set be reduced to its minimum practicable size, or that the filter-set reduce it as far as it can within a strictly limited amount of CPU time?   How predictable and consistent are these “costs,” and upon what do they depend?
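For what it’s worth:   if per-filter cost and pass rate really were independent of ordering, this would be a solved optimization ... the classic greedy rule is to run filters in ascending order of cost divided by rejection rate, so that cheap, aggressive filters go first.   A minimal Perl sketch, with entirely made-up filter names and numbers, just to show the rule:

    use strict;
    use warnings;

    # Hypothetical filters: average per-element cost (seconds) and the
    # fraction of elements that survive each one.  Both are assumed to
    # be independent of ordering -- the very assumption questioned above.
    my @filters = (
        { name => 'cheap_coarse', cost => 1e-6, pass => 0.50 },
        { name => 'mid_grained',  cost => 5e-6, pass => 0.20 },
        { name => 'expensive',    cost => 4e-5, pass => 0.05 },
    );

    # Greedy rule for independent predicates: ascending cost-per-rejection.
    my @ordered = sort {
        $a->{cost} / (1 - $a->{pass}) <=> $b->{cost} / (1 - $b->{pass})
    } @filters;

    print join(' -> ', map { $_->{name} } @ordered), "\n";

The moment the pass rates become conditional on what ran earlier, those per-filter numbers stop being well-defined, and the greedy rule is only an approximation.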

If you can by any means see your way to casting this as a true optimization problem, then by all means do so ... but my guess is that you can’t, because if you could, the best ordering would already be determined and there would be no real choice to make.   It is quite an interesting sort of problem, and my guess is that the solution of choice will be a heuristic rather than an algorithm, much less a proof.   You’ll come up with a self-adapting method, along the lines sketched below, that consistently produces good-enough results ... perhaps never knowing whether a few more milliseconds could have been shaved at any particular instant.
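To make “self-adapting” concrete, here is one shape such a heuristic could take:   wrap each filter so that its observed cost and rejection rate are tallied as elements flow through, and periodically re-sort the pipeline by observed cost-per-rejection.   The stand-in predicates and the re-order interval below are hypothetical; this is a sketch of the idea, not a tuned implementation:

    use strict;
    use warnings;
    use Time::HiRes qw(gettimeofday tv_interval);

    my $REORDER_EVERY = 1_000;    # re-sort the pipeline this often

    # Stand-in predicates; each entry also carries running statistics.
    my @pipeline = map {
        +{ %$_, time => 0, seen => 0, passed => 0 }
    } (
        { name => 'f1', test => sub { $_[0] % 2 == 0 } },
        { name => 'f2', test => sub { $_[0] % 3 != 0 } },
    );

    # Observed cost per rejection; unmeasured (or never-rejecting)
    # filters sort last.
    sub rank {
        my ($f) = @_;
        return 9**9**9 unless $f->{seen} && $f->{passed} < $f->{seen};
        my $reject_rate = 1 - $f->{passed} / $f->{seen};
        return ( $f->{time} / $f->{seen} ) / $reject_rate;
    }

    sub keep_element {
        my ($elem) = @_;
        for my $f (@pipeline) {
            my $t0 = [gettimeofday];
            my $ok = $f->{test}->($elem);
            $f->{time}   += tv_interval($t0);
            $f->{seen}   += 1;
            $f->{passed} += 1 if $ok;
            return 0 unless $ok;
        }
        return 1;
    }

    my @kept;
    for my $elem ( 1 .. 10_000 ) {
        push @kept, $elem if keep_element($elem);
        @pipeline = sort { rank($a) <=> rank($b) } @pipeline
            unless $elem % $REORDER_EVERY;
    }
    print scalar(@kept), " of 10000 elements survived\n";

The re-order interval is the usual knob:   too small and you pay sorting overhead while chasing noise; too large and the pipeline adapts sluggishly when the data drifts.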