http://qs321.pair.com?node_id=749363


in reply to Limiting a glob

perl -e 'while ($s=<*>) { print $s,"\n"; sleep 5 }'

As you can see, a glob can read filenames one at a time (hopefully Perl really buffers the filenames in the background rather than reading them all in at once). You might change it to something like this:

use strict;
use warnings;

my @files;
my $filecount = 0;
while (my $s = <*>) {
    push @files, $s;
    if (++$filecount >= 100) {
        DoTheMoveWith(@files);
        @files     = ();   # reset the batch, or later calls re-move earlier files
        $filecount = 0;
    }
}
DoTheMoveWith(@files) if @files;

Replies are listed 'Best First'.
Re^2: Limiting a glob
by tirwhan (Abbot) on Mar 09, 2009 at 18:13 UTC
    (hopefully perl really buffers the filenames in the background and doesn't read them in all at once)

    No, perl reads the entire list of files returned by glob into memory at once. Run the following to test this (WARNING: may run for a long time or cause resource problems on weaker machines).

    #!/usr/bin/perl
    use strict;
    use warnings;

    mkdir "gtest" or die "screaming";
    for (1..40000) {
        open my $f, ">", "gtest/$_" or die "gnashing";
        close $f or die "howling";
    }
    my $c = 0;
    while (my $r = glob("gtest/*")) {
        $c++;
        if ($c == 1) {
            # delete every file after the first glob result has been returned
            for (1..40000) {
                unlink "gtest/$_" or die "wailing";
            }
        }
    }
    print "$c\n";
    rmdir "gtest" or die "exhausted";

    I believe this is done so that glob returns a consistent snapshot of the directory contents as they were at some point in time, regardless of whether the contents change while you process the results. If you want more up-to-date data, with only currently existing files being returned, you'll have to use opendir and readdir.
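
    The opendir/readdir approach mentioned above can be sketched like this; it reads directory entries lazily and hands them off in fixed-size batches. The sub name, directory, and batch size are illustrative, not from the original thread.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Read a directory with opendir/readdir (one entry at a time, no
    # full-list snapshot like glob) and invoke $callback on each batch.
    sub process_in_batches {
        my ($dir, $batch_size, $callback) = @_;
        opendir my $dh, $dir or die "Cannot open $dir: $!";
        my @batch;
        while (defined(my $name = readdir $dh)) {
            next if $name eq '.' or $name eq '..';
            push @batch, "$dir/$name";
            if (@batch >= $batch_size) {
                $callback->(@batch);
                @batch = ();   # start the next batch fresh
            }
        }
        $callback->(@batch) if @batch;   # flush the final partial batch
        closedir $dh;
    }
    ```

    Unlike glob, readdir returns whatever the directory contains at the moment each entry is read, so files created or deleted mid-scan may or may not appear.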


    All dogma is stupid.
      perl reads the entire list of files returned by glob into memory at once.

      This is what I originally assumed, hence my original question. So, I guess that makes the answer to my question, "No, you can't set a limit on a glob."

      Thanks

Re^2: Limiting a glob
by zod (Scribe) on Mar 09, 2009 at 17:51 UTC
    Thanks. This illustrates a basic misunderstanding I had of the way glob actually works internally.