in reply to Re: Evolving a faster filter? (code)
in thread Evolving a faster filter?

Heh, I just had what might be a great idea on how to do the first sort.

It is trivial to do a two-item optimization in your sort comparison routine. This is "cheating" because you can end up telling sort that you want $a < $b and $b < $c but $c < $a. On old versions of Perl, that might even cause a core dump. I think it is actually documented as safe on modern Perl, however.

my @order = sort {
    $Costs[$a] + $Trims[$a] * $Costs[$b]
        <=> $Costs[$b] + $Trims[$b] * $Costs[$a]
} 0 .. $#Costs;
@Trims = @Trims[@order];
@Costs = @Costs[@order];

It is that simple. It worked reasonably well in a couple of test cases. It even found an optimal solution on the first try for my prior example above (but took too long to declare success, probably indicating a bug).
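A quick way to sanity-check the comparator (my own sketch, not tye's harness; the costs and trims are made up, and I'm assuming $Trims[$i] is the fraction of items that survive filter $i, which is what the comparator's formula implies):

```perl
use strict;
use warnings;

my @Costs = ( 5, 1, 4 );      # per-item cost of each filter
my @Trims = ( 0.5, 0.8, 0.1 );    # fraction of items surviving each filter

# Expected total cost per item of running the filters in a given order.
sub total_cost {
    my ( $cost, $live ) = ( 0, 1 );    # $live = fraction still flowing
    for my $i (@_) {
        $cost += $live * $Costs[$i];
        $live *= $Trims[$i];
    }
    return $cost;
}

my @order = sort {
    $Costs[$a] + $Trims[$a] * $Costs[$b]
        <=> $Costs[$b] + $Trims[$b] * $Costs[$a]
} 0 .. $#Costs;

print "order: @order  cost: ", total_cost(@order), "\n";
# prints: order: 2 1 0  cost: 4.5
```

Filter 2 is cheap and trims aggressively, so it goes first; filter 0 is expensive and trims little, so it goes last.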

- tye        

Replies are listed 'Best First'.
Re^3: Evolving a faster filter? (optimal solution?)
by LanX (Sage) on Jan 04, 2013 at 21:22 UTC
    my @order = sort { $Costs[$a]+$Trims[$a]*$Costs[$b] <=> $Costs[$b]+$Trims[$b]*$Costs[$a] } 0..$#Costs;

    I'm pretty sure that this is the optimal solution, but I'm too lazy to write down the complete mathematical proof.

    The basic idea is that the two terms represent the cost difference caused by swapping the first and the second filter¹!

    So by sorting you get a solution which can't be improved by any pairwise swap, and any permutation can be represented as a sequence of pairwise swaps.

    Cheers Rolf

    ¹) see the restructured formula for c in Re^2: Evolving a faster filter? (combinatorics) and you will notice that these are the first terms while the rest is fixed.
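    Spelled out (my notation, with C for per-item cost and T for the surviving fraction): with filter a run directly before filter b, that pair contributes C_a + T_a·C_b to the total, and everything else is unchanged, so

```latex
\mathrm{cost}(a,b) - \mathrm{cost}(b,a)
  = (C_a + T_a C_b) - (C_b + T_b C_a)
  = C_a (1 - T_b) - C_b (1 - T_a) < 0
\iff
\frac{C_a}{1 - T_a} < \frac{C_b}{1 - T_b}
```

    (dividing by (1-T_a)(1-T_b) > 0 in the last step), which is exactly the comparator, and also the simpler weight form below.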


    Basic transformations lead to a simpler solution:

    my @order = sort { $Costs[$a]/(1-$Trims[$a]) <=> $Costs[$b]/(1-$Trims[$b]) } 0..$#Costs;

    where the weights can be precalculated once to save time in the comparator:

    my @order = sort { $weight[$a] <=> $weight[$b] } 0..$#Costs;

    (the edge case of a division by zero, when $Trims[$_] == 1, must be handled as practically infinite weight)
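    A sketch of that precalculated variant (the example values and the 9**9**9 infinity idiom are mine, not code from the thread):

```perl
use strict;
use warnings;

my @Costs = ( 5, 1, 4 );
my @Trims = ( 0.5, 0.8, 1.0 );    # filter 2 removes nothing: 1-T == 0

# weight = cost / (1 - trim); a filter that removes nothing gets
# infinite weight and therefore sorts last.
my @weight = map {
    $Trims[$_] >= 1
        ? 9**9**9    # overflows to +Inf, which <=> handles fine
        : $Costs[$_] / ( 1 - $Trims[$_] )
} 0 .. $#Costs;

my @order = sort { $weight[$a] <=> $weight[$b] } 0 .. $#Costs;
print "order: @order\n";    # prints: order: 1 0 2
```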

      That's not enough to prove it. You certainly get a solution that can't be improved by a single swap of adjacent filters. And if your sort order using that algorithm is well-defined, then you will get an optimal solution. That is, if you never get the $a < $b < $c < $a case. Update: No, maybe not even that (unless someone can convince me that "of adjacent filters" is not needed above). Update: Ah, ruling out adjacent swaps also rules out any swaps. Update: Silly me. Ruling out swaps of elements that start out adjacent doesn't rule out swaps of elements that didn't start out adjacent (but became adjacent from swapping elements that did start out adjacent).

      I thought I had already found a case where it wasn't optimal. But the total cost was reported as identical to the first ordering's cost, despite a different ordering being chosen. So it might be hard to find a case where it isn't optimal.

      I bet that makes anything more elaborate not worth the effort for Ovid's usage, at least. :)

      - tye        
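      One way to hunt for such a case (my own sketch, with made-up values): brute-force every permutation of a small instance and compare against the sorted order.

```perl
use strict;
use warnings;

my @Costs = ( 3, 7, 2, 5 );
my @Trims = ( 0.6, 0.2, 0.9, 0.4 );

# Expected per-item cost of running the filters in the given order.
sub total_cost {
    my ( $cost, $live ) = ( 0, 1 );
    for my $i (@_) {
        $cost += $live * $Costs[$i];
        $live *= $Trims[$i];
    }
    return $cost;
}

# All permutations of a list, as array refs (fine for small lists).
sub permutations {
    my @items = @_;
    return [] unless @items;
    my @perms;
    for my $i ( 0 .. $#items ) {
        my @rest = @items;
        my ($pick) = splice @rest, $i, 1;
        push @perms, [ $pick, @$_ ] for permutations(@rest);
    }
    return @perms;
}

my @order = sort {
    $Costs[$a] + $Trims[$a] * $Costs[$b]
        <=> $Costs[$b] + $Trims[$b] * $Costs[$a]
} 0 .. $#Costs;

my ($best) = sort { total_cost(@$a) <=> total_cost(@$b) }
    permutations( 0 .. $#Costs );

printf "sorted: %g  brute force: %g\n",
    total_cost(@order), total_cost(@$best);
```

      For this instance (and every other one I tried) the two costs agree, which is consistent with LanX's optimality argument.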

        Well, my argument is sufficient to show that a solution in which two adjacent filters don't follow this order can't be optimal, because otherwise swapping those adjacent filters f[i] and f[i+1] would improve the result.

        So any optimal solution must follow this strict ordering criterion. ¹

        qed! =)

        Cheers Rolf

        PS: I'm glad I didn't start implementing the B&B algorithm :-)


        ¹) and it's easy to see that all ordered solutions (plural because adjacent filters can have the same weight) imply the same total cost.
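        A two-filter illustration of that footnote (values are mine): both filters below have weight C/(1-T) = 10, and either order costs the same.

```perl
use strict;
use warnings;

my @Costs = ( 5, 1 );
my @Trims = ( 0.5, 0.9 );    # weights: 5/0.5 == 1/0.1 == 10

sub total_cost {
    my ( $cost, $live ) = ( 0, 1 );
    for my $i (@_) {
        $cost += $live * $Costs[$i];
        $live *= $Trims[$i];
    }
    return $cost;
}

print total_cost( 0, 1 ), " == ", total_cost( 1, 0 ), "\n";
# prints: 5.5 == 5.5
```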