in reply to File::Find memory leak

Saw your recent post with this link. You should find that a variation on this works without leaking. It 'recurses' breadth-first using a very useful Perl hack: you can push onto an array while you are iterating over it (just don't shift or splice it). All it does is push directories onto its dir list as it finds them. Short, simple, and fast.
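
A minimal sketch of that push-while-iterating trick, with made-up numbers just for illustration (note perldoc warns that modifying an array mid-foreach is formally unspecified, which is why only push is safe here):

    my @queue = ( 1, 2 );
    for my $n ( @queue ) {
        push @queue, $n * 10 if $n < 10;   # grows the list mid-loop
    }
    # @queue ends up as ( 1, 2, 10, 20 ) - the loop picks up the new elements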

This builds an array of all the files it finds (full paths), but you could call your &wanted code in there instead and have it return nothing. With UNC paths you will want to swap the / to \\.

sub recurse_tree {
    my ($root) = @_;
    $root =~ s!/$!!;
    my @dirs = ( $root );
    my @files;
    for my $dir ( @dirs ) {
        opendir my $dh, $dir or do { warn "Can't read $dir\n"; next };
        for my $file ( readdir $dh ) {
            # skip . and .. entries
            next if $file eq '.' or $file eq '..';
            # skip symlinks
            next if -l "$dir/$file";
            if ( -d "$dir/$file" ) {
                # queue the subdir; the outer loop will reach it later
                push @dirs, "$dir/$file";
            }
            else {
                push @files, "$dir/$file";
            }
        }
        closedir $dh;
    }
    return \@dirs, \@files;
}
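
A quick usage sketch (the /tmp path here is just an example - point it anywhere):

    my ($dirs, $files) = recurse_tree('/tmp');
    print "dir:  $_\n" for @$dirs;    # every directory found, root first
    print "file: $_\n" for @$files;   # every non-symlink file, full path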