Re: scalable duplicate file remover

by jwkrahn (Abbot)
on Mar 03, 2008 at 08:20 UTC


in reply to scalable duplicate file remover

sub process_file {
    my $dir_configs=$_[0];
    ##optimisation using -d -l -f -s just once for return and also for adding

    #if current "file"(unix terminology) is a directory and the yaml configuration
    #tells us to eliminate directories from the search we do so by returning from the
    #callback
    return if -d $File::Find::name && ! $dir_configs->{dir};

You call stat on the file.

    return if -l $File::Find::name && ! $dir_configs->{link};

You call lstat on the same file.

    return if -f $File::Find::name && ! $dir_configs->{file};

You call stat on the same file again.

    return if -s $File::Find::name < $config->{minsize};

You call stat on the same file again.

    unless($File::Find::name =~  /$dir_configs->{regex}/) {
        if(-d $File::Find::name) {

You call stat on the same file again.

            $File::Find::prune=1;
        }
        return;
    }

    my ($dev,$ino,$mode,$nlink,$uid,$gid,$rdev,$size,
        $atime,$mtime,$ctime,$blksize,$blocks)
    = stat($File::Find::name);

You call stat on the same file again. You declare 13 variables but you are only using one.
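A list slice of stat's return value keeps only the fields you actually need (a sketch, not part of the original node):

    # indices 7 and 9 of stat's 13-element list are size and mtime
    my ( $size, $mtime ) = ( stat $File::Find::name )[ 7, 9 ];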

    my $last_modif_time=DateTime->from_epoch(epoch=>$mtime);
#    printf "%s %s %s %s\n",
#    $File::Find::name,
#    file2sha1($File::Find::name),
#    -s $File::Find::name,

Commented out, but if it weren't, this would call stat on the same file again.

#    $last_modif_time;

    add_to_db(file2sha1($File::Find::name),$last_modif_time,-s $File::Find::name,$File::Find::name);

You call stat on the same file again. You call add_to_db() which calls stat or lstat three more times.

    #print Dumper $dir_configs;
};

In total you call stat or lstat ten times on the same file (eleven times if you uncomment the printf statement). You also use $File::Find::name in most places where $_ would have the same effect.

sub process_file {
    my $dir_configs = $_[ 0 ];
    ##optimisation using -d -l -f -s just once for return and also for adding

    #if current "file"(unix terminology) is a directory and the yaml configuration
    #tells us to eliminate directories from the search we do so by returning from the
    #callback
    return if -l && ! $dir_configs->{ link };   # call lstat on current file to test for symlink

    my ( $size, $mtime ) = ( stat )[ 7, 9 ];

    return if -d _ && ! $dir_configs->{ dir };
    return if -f _ && ! $dir_configs->{ file };
    return if $size < $config->{ minsize };

    unless ( $File::Find::name =~ /$dir_configs->{regex}/ ) {
        if ( -d _ ) {
            $File::Find::prune = 1;
        }
        return;
    }

    my $last_modif_time = DateTime->from_epoch( epoch => $mtime );

#    print "$File::Find::name ", file2sha1( $_ ), " $size $last_modif_time\n",

    add_to_db( file2sha1( $_ ), $last_modif_time, $size, $File::Find::name );

    #print Dumper $dir_configs;
}
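The rewrite relies on Perl caching the result of the most recent stat, lstat, or file test in the special _ filehandle: the single explicit stat fills that buffer, and the later -d _ and -f _ tests reuse it without touching the filesystem again, so the tests in the rewritten sub need only one lstat (for -l) and one stat instead of the repeated calls counted above. A minimal standalone illustration of the same idiom (the directory name is only a placeholder):

    use File::Find;

    find( sub {
        return unless -f;              # one stat() system call; -f defaults to testing $_
        print "$File::Find::name: ", -s _, " bytes\n";   # -s _ reuses the cached stat buffer
    }, '/some/dir' );

Because File::Find chdir()s into each directory by default, $_ inside the callback holds just the current entry's basename, which is why file tests and file2sha1( $_ ) work there without the full path.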

Replies are listed 'Best First'.
Re^2: scalable duplicate file remover
by spx2 (Deacon) on Mar 03, 2008 at 09:26 UTC
    Thank you very much for the proposed optimisation.
    Reading the documentation for the -X operators, I saw that they
    also use " _ ", as you do here.
    Couldn't just one call to stat be performed, getting all the information from there
    (including the information provided by the -X operators)?
      Yes, and in fact that is what I have shown in the example code I posted above.
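      For completeness, the type checks that the -X tests perform can also be read
      directly from the mode bits of a single lstat, using the S_IS* helpers that
      Fcntl exports (a sketch along those lines, not the code posted in this thread):

          use Fcntl ':mode';    # S_ISLNK, S_ISDIR, S_ISREG

          # one lstat: a symlink is reported as itself rather than as its target
          my ( $mode, $size, $mtime ) = ( lstat $File::Find::name )[ 2, 7, 9 ];
          return unless defined $mode;               # lstat failed, e.g. file vanished
          return if S_ISLNK( $mode ) && ! $dir_configs->{ link };
          return if S_ISDIR( $mode ) && ! $dir_configs->{ dir };
          return if S_ISREG( $mode ) && ! $dir_configs->{ file };

      For a symlink, size and mtime would then describe the link itself rather than its
      target; stat (which the posted rewrite uses after the -l test) follows the link
      and reports the target instead.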
