http://qs321.pair.com?node_id=320779

ibanix has asked for the wisdom of the Perl Monks concerning the following question:

Hello monks,

I've got a Windows 2000 file server with over 2 million PDF files. We want to delete any of these PDFs that are more than X days old (X seems to keep changing). So, I wrote up a script (below) to automate this; I'm using ActivePerl 5.6.

My problem is that the script seems to cause the system to eat up all of its "system cache" memory! The perl process itself only seems to use a nominal amount of memory. I blame Windows 2000, but has anyone else seen this problem? Anything in my script that stands out as bad?

### Delete pdf files older than X days from given directory tree ###
use strict;
use warnings;
use File::Find;

my @directories;
$directories[0] = $ARGV[0];
my $days_old    = $ARGV[1];

finddepth(\&wanted, @directories);

# File::Find coderef
# Find all PDFs older than given # of days & delete
sub wanted {
    # Turn forward slashes into backslashes to make real Win32 paths
    my $file = $File::Find::name;
    $file =~ s|/|\\|g;

    if ( ($file =~ m|\.pdf$|) && (int(-M $file) > $days_old) ) {
        print "Found: $file, deleted\n";
        unlink($file) || print "Unable to delete $file!\n";
    }
}
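For reference, I invoke it with the root directory as the first argument and the age cutoff in days as the second; the script name and directory below are just placeholders:

C:\> perl delete_old_pdfs.pl D:\pdf_archive 30

That walks D:\pdf_archive depth-first and unlinks every .pdf whose age (per -M) works out to more than 30 days.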


Thanks in advance,
ibanix

$ echo '$0 & $0 &' > foo; chmod a+x foo; foo;