PerlMonks
Re: Perl file operations

by jwkrahn (Abbot)
on Feb 23, 2008 at 03:23 UTC ( [id://669713] )


in reply to Perl file operations

While I don't have a directory with 60,000 files in it to test your code, I would guess that <*.txt> is creating such a large list that it runs out of memory. Try using opendir and then readdir in a while loop to cut down on memory usage, so only one filename is held at a time:

    opendir my $DH, '.' or die "Cannot open the current directory: $!";
    while ( my $file = readdir $DH ) {
        next if '.txt' ne substr $file, -4;
        # process $file here
    }
    closedir $DH;

Replies are listed 'Best First'.
Re^2: Perl file operations
by chromatic (Archbishop) on Feb 23, 2008 at 04:42 UTC
    While I don't have a directory with 60,000 files in it to test your code I would guess that <*.txt> is creating such a large list that it runs out of memory.

    At an estimate of 64 bytes of overhead per SV, 60,000 strings produced by that list will take up 3.6 megabytes of memory, not counting the lengths of the filenames.
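A quick back-of-envelope check of that arithmetic (the 64 bytes of per-SV overhead is an estimate from the reply above, not a measured value, and filename lengths are excluded):

```shell
# 60,000 scalars at an assumed ~64 bytes of per-SV overhead each,
# converted to mebibytes (1 MiB = 1048576 bytes).
awk 'BEGIN { printf "%.2f MiB\n", 60000 * 64 / 1048576 }'
# prints "3.66 MiB"
```

So the glob's list alone costs only a few megabytes of overhead, which on its own should not exhaust memory on a typical machine.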

      Monks,
      I am stupid. The file extensions in the directory were different from what I was expecting.
      I fixed the file extension, and it slowly but surely processed the 60,000+ files into their respective directories. I watched memory usage on the successful run, and it hovered between 60 and 70 megabytes, using ActivePerl.
      Please don't harm me :(
        We'll hold off on the crucifixion.
