DenairPete has asked for the wisdom of the Perl Monks concerning the following question:

I am at my wit's end with this Perl script, running on AIX. I am processing a large directory every night. It accumulates around 1 million files each night, half of which are ".txt" files that I need to process. Each ".txt" file is pipe-delimited and contains only 20 records. Record #6 is the one that contains the info I need to determine which directory to move the file to (in this case the file would be moved to "/out/4"). Example record: A|CHNL_ID|4 (third-party software creates these files, and they didn't think to include the channel in the file name). As of now this script is processing at a rate of 80,000 files per hour. Are there any recommendations on how I could speed this up?

    use File::Copy qw(move);   # needed for move()

    opendir my $dh, $dir or die "Can't open $dir: $!\n";
    while ( defined( my $txtFile = readdir $dh ) ) {
        next if $txtFile !~ /\.txt$/;   # escape the dot: /.txt$/ also matches e.g. "atxt"
        $cnt++;

        # slurp the whole file at once
        local $/;
        open my $fh, '<', "$dir/$txtFile"   # readdir returns bare names, so prepend $dir
            or die "Can't read $txtFile: $!\n";
        my $data = <$fh>;
        close $fh;

        my ($channel) = $data =~ /A\|CHNL_ID\|(\d+)/i;
        next unless defined $channel;   # skip files with no channel record

        move( "$dir/$txtFile", "$outDir/$channel" )
            or die "Can't move $txtFile: $!\n";
    }
    closedir $dh;
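For reference, the channel extraction inside the loop boils down to a single regex match. A minimal self-contained sketch, using a sample record taken from the description above:

```perl
use strict;
use warnings;

# Sample record as described: field 3 of the pipe-delimited CHNL_ID line
# is the channel number that picks the output directory.
my $data = "A|CHNL_ID|4\n";

my ($channel) = $data =~ /A\|CHNL_ID\|(\d+)/i;

print "channel = $channel\n";   # prints: channel = 4
```

Because the match is anchored to the literal "A|CHNL_ID|" prefix, it finds the right field no matter where record #6 sits in the slurped file contents.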