PerlMonks
Re: Find Duplicate Files

by kingman (Scribe)
on Jul 04, 2002 at 20:56 UTC ( [id://179535] )


in reply to Find Duplicate Files

Hi, I wrote a command-line utility using your module that makes it easy to delete duplicate files.
#!/usr/bin/perl -w
use strict;
use File::Find::Duplicates;

$|++;    # autoflush the output buffer

usage() if $#ARGV == -1;    # no directories given

my %dupes = find_duplicate_files(@ARGV);
die "No duplicates found!\n" unless keys %dupes;

print "############ Duplicate File Report & Removal Utility ############\n";
my $i = 1;
foreach my $fsize (keys %dupes) {
    print "#" x 64 . " " . $i++ . "\n";
    print map {
        -l $_
            ? "# push \@delete, '$_'; # symlinked to " . readlink($_) . "\n"
            : "# push \@delete, '$_';\n"
    } @{ $dupes{$fsize} };
    print "\n";
}
print "unlink \@delete;\n";

sub usage {
    (my $script_name = $0) =~ s#.*/##;    # $0 = full path to script
    print <<END;
Generates a report on duplicate files.
Usage: $script_name [list of directories]
END
    exit;
}

### POD ###

=head1 Name

dupes - a command-line utility to report on all duplicate files, even if
they have different names. This is good for mp3s and multiple drafts of
documents that may have been backed up in different places.

=head1 Synopsis

dupes [list of directories to search recursively]

=head1 From an empty buffer in Vim

The following commands will fill the buffer with a report of all duplicate
files.

    :%!dupes [list of directories]

B<or>

    !!dupes [list of directories]

The report generated by the above commands is yet another perl script that
can be edited, allowing you to flag certain files for removal.

The following command will run the report and remove all flagged files.

    :%!perl

Nothing is deleted unless you flag the file by uncommenting the line.
If you don't understand how the report works, the following commands should
explain it.

    perldoc -f push
    perldoc -f unlink

=head1 AUTHOR

Kingsley Gordon, E<lt>kingman@ncf.caE<gt>

last modified: Thu Jul  4 15:11:26 EDT 2002

=cut

Replies are listed 'Best First'.
Re^2: Find Duplicate Files
by Anonymous Monk on Aug 09, 2008 at 23:32 UTC
    It would be nice if the script deleted the duplicates but later created a hard link from the original to the deleted file. That way you don't waste any space and you have no risk of breaking anything.
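    A minimal sketch of that idea (not from the original script; the subroutine name is hypothetical): instead of only unlinking the duplicate, unlink it and then hard-link the surviving original to the duplicate's old path. Both paths then share one inode, so no space is wasted and the old path still works. This assumes both files live on the same filesystem, since hard links cannot cross filesystem boundaries.

    #!/usr/bin/perl -w
    use strict;

    # Remove $duplicate and replace it with a hard link to $original.
    # After this call, both paths refer to the same inode.
    sub relink_duplicate {
        my ($original, $duplicate) = @_;
        unlink $duplicate
            or die "unlink $duplicate: $!";
        link $original, $duplicate
            or die "link $original -> $duplicate: $!";
        return 1;
    }

    One caveat with this approach: after relinking, editing the file through either path changes the content seen at both paths, which may surprise you if the "duplicates" were independent drafts rather than true copies.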
