Re^2: Derangements iterator (duplicates)

by tye (Sage)
on Jan 08, 2008 at 15:35 UTC


in reply to Re: Derangements iterator
in thread Derangements iterator

Update If one needed to allow for deranging a list which contains duplicates, one could simply derange the list of its indices.

Um, not really. A derangement algorithm that "doesn't handle duplicates" is one that deranges "1 4 4" into "4 1 4" because it doesn't notice that the duplicates are the same, and so doesn't realize that replacing the last 4 with the second 4 didn't actually cause the last item to be different. It also, when deranging "1 1 2 2", returns "2 2 1 1" four times instead of just once, because it doesn't realize that reversing "1 1" (or "2 2") gives the same thing.

So your first algorithm simply breaks when presented with duplicates (it always finds zero derangements, for those who didn't notice), but your work-around just prevents this breakage while not actually handling the duplicates correctly. (So I'd suggest you document how it doesn't handle duplicates rather than suggest that work-around.)
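To make that concrete, here is a quick sketch of my own (not the parent node's code; naive_index_derange is just a name I made up) that does the suggested work-around: it deranges the list of indices, blind to duplicates, and then maps the indices back to values:

use strict;
use warnings;

# Derange the indices 0..$#vals (no index may stay in its own slot),
# then hand the values at those indices to the callback. Duplicate
# values are never looked at, which is exactly the problem.
sub naive_index_derange {
    my( $cb, @vals ) = @_;
    my $recurse;
    $recurse = sub {
        my( @picked ) = @_;
        my $pos = @picked;
        return $cb->( @vals[@picked] )  if $pos == @vals;
        my %used;
        @used{@picked} = ();
        for my $i ( 0 .. $#vals ) {
            next if exists $used{$i};   # each source index used once
            next if $i == $pos;         # the index itself must move
            $recurse->( @picked, $i );
        }
    };
    $recurse->();
}

naive_index_derange( sub { print "@_\n" }, 1, 1, 2, 2 );

That prints nine lines for "1 1 2 2". The one real derangement, "2 2 1 1", shows up four times, and the other five lines each leave a 1 or a 2 sitting in its original spot, so they aren't derangements of the values at all.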

Just for fun, here is your algorithm modified to correctly handle duplicates:

sub _derange {
    my( $cb, $av, $todo, @i ) = @_;
    return $cb->( @$av[@i] )  if ! @$todo;
    my( %iseen, %vseen );
    @iseen{@i} = ();            # source indices already used
    @vseen{@$av[@i]} = @i;      # for each value, the last (highest) index used for it
    my( $range, @todo ) = @$todo;
    for( @$range ) {
        _derange( $cb, $av, \@todo, @i, $_ )
            if  ! exists $iseen{$_}
            and (  ! exists $vseen{$av->[$_]}
                ||  $vseen{$av->[$_]} < $_  );
    }
}

sub derange(&@) {
    my $cb = shift @_;
    _derange( $cb, \@_,
        [ map {
            my $x = $_[$_];
            # indices whose value differs from the value at this position
            [ grep { $_[$_] ne $x } 0..$#_ ]
          } 0..$#_
        ],
    );
}

derange( sub { print "@_\n" }, 1,1,2,2,3 );
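For the (1,1,2,2,3) call above, that should print each of the four distinct derangements exactly once:

2 2 1 3 1
2 2 3 1 1
2 3 1 1 2
3 2 1 1 2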

Of course, if you try to look up a definition for "derangement", you won't find anything that makes much sense when considering duplicates, because mathematicians define derangements in terms of permutations, which they in turn define without considering duplicates (though they usually don't use language that actually makes that clear, either).

But the extension of these two concepts to cover lists with duplicate elements is natural, even obvious, as well as useful.

- tye        
