Perl limits on number of open files?

by moof1138 (Curate)
on May 14, 2004 at 23:00 UTC

moof1138 has asked for the wisdom of the Perl Monks concerning the following question:

I am doing some stress testing on a system. I am running Perl 5.8 on Mac OS X. One of the things we need to do is open a very large number of files. I figured that with Perl there would be an easy way to open 50,000 files and write to them once in a while, but I ran into two questions.

First, is there a way to make an array of filehandles? I was in a hurry, so I just programmatically generated a script with a bunch of opens, but this seems lame. The script basically looks like this:
mkdir "/tmp/lotsofopenfiles$$"; open (FILE1, ">>/tmp/lotsofopenfiles$$/1") || warn "$!"; print FILE1 "0"; open (FILE2, ">>/tmp/lotsofopenfiles$$/2") || warn "$!"; print FILE2 "0"; #etc...
Second, when I run this I seem to be running into a limit of 250 open files per process. I have set ulimit to unlimited and tried running as root, but didn't get past this. I looked in the docs to see if I could find anything on open file limits, but didn't find anything. Does anyone know what might be going on here? Here is the error:
    Too many open files at openlotsoffiles.pl line 510.
    print() on closed filehandle FILE252 at openlotsoffiles.pl line 511.
    Too many open files at openlotsoffiles.pl line 512.
    print() on closed filehandle FILE253 at openlotsoffiles.pl line 513.
Any advice on alternate approaches would be appreciated.

Replies are listed 'Best First'.
Re: Perl limits on number of open files?
by theorbtwo (Prior) on May 14, 2004 at 23:17 UTC

    As far as making an array of filehandles goes, the answer with "bareword filehandles" like you have is no, or at least there's no easy way. OTOH, IO::File lets you open files and get back objects, which are much easier to deal with. (Technically, the two are not as distinct as they first appear -- there's lots of interesting stuff going on under the covers. Don't worry about it for now.)
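
    A minimal sketch of that approach, reusing the /tmp directory layout from the original script (the count of 1000 files is just a placeholder):

        use IO::File;

        my $dir = "/tmp/lotsofopenfiles$$";
        mkdir $dir or die "mkdir $dir: $!";

        # collect IO::File objects in a plain array
        my @fhs;
        for my $i (1 .. 1000) {
            my $fh = IO::File->new(">> $dir/$i") or warn "open $dir/$i: $!";
            push @fhs, $fh if $fh;
        }

        # write to every handle once in a while
        $_->print("0") for @fhs;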

    As to the open files limit, I feel decently secure in saying that it's not perl's fault, but OSX's. Googling for "too many open files" OSX may (or may not) help.

Re: Perl limits on number of open files?
by dave_the_m (Monsignor) on May 14, 2004 at 23:20 UTC
    The only limit on the number of open files is whatever underlying limits your OS imposes. I'm not familiar with OSX, but with some OSes it's just a case of messing with ulimit; others require setting a kernel parameter in, say, /etc/system; others require recompiling the kernel.
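
    If it helps, you can ask the OS what limit is actually in effect from within Perl; a quick sketch using the core POSIX module:

        use POSIX qw(sysconf _SC_OPEN_MAX);

        # how many files may this process have open at once?
        my $max = sysconf(_SC_OPEN_MAX);
        print "open-file limit: ", defined $max ? $max : "unknown", "\n";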

    As for an array of filehandles, the following demonstrates one; it prints the first line from each of three common system files:

    my @fhs;
    for my $file (qw(/etc/hosts /etc/rpc /etc/services)) {
        my $fh;
        open $fh, $file or die "open $file: $!\n";
        push @fhs, $fh;
    }
    print scalar <$_> for @fhs;
    I believe there is a Perl module that maintains a cache of open filehandles, but I can't for the life of me remember what it's called. No doubt someone else will.
Re: Perl limits on number of open files?
by moof1138 (Curate) on May 15, 2004 at 00:00 UTC
    I figured it out. I had set ulimit to 'unlimited', which didn't work. When I set it to a particular number, it worked... I used IO::File to make an array of filehandles, and that works much better. Thanks theorbtwo, and everyone else.
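
    For anyone hitting the same wall: the shell incantation is along the lines of ulimit -n 10000 before starting perl, and if the (non-core) BSD::Resource module is installed, a script can raise its own limit. A sketch, assuming BSD::Resource is available:

        use BSD::Resource qw(getrlimit setrlimit RLIMIT_NOFILE);

        # current soft and hard limits on open file descriptors
        my ($soft, $hard) = getrlimit(RLIMIT_NOFILE);
        print "soft=$soft hard=$hard\n";

        # raise the soft limit to a concrete number, not 'unlimited'
        setrlimit(RLIMIT_NOFILE, 10_000, $hard) or die "setrlimit: $!";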
Re: Perl limits on number of open files?
by jonadab (Parson) on May 14, 2004 at 23:51 UTC

    I'd be pretty surprised if this were a Perl limit, unless it's one of those things you configure at compile time (when you compile perl, I mean), or a limitation of the compiler that compiled perl, or of the standard C library it was compiled with, or something like that.

    More likely, you're hitting an OS limitation. This surprises me a little coming from OS X, but it's not unheard-of. Earlier versions of MacOS had a global system-wide limit, and it was preposterously low. I think 9.0 *raised* the limit to something like 256 or maybe 512, and again, that's global for all apps combined. If OS X limits *each process* to 250, that seems inane, but it's still an improvement.

    My perl here can open a thousand files for output (5.8.1 on Mandrake 9.2), but it doesn't seem to be able to go much higher than that. $! tells me "Inappropriate ioctl for device" at about the 1022nd file. Again, this seems bad, but I doubt it's perl's fault.
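
    The probe itself is trivial to reproduce; a sketch (the /tmp filenames are arbitrary):

        # keep opening files until the OS refuses
        my @handles;
        my $n = 0;
        while (open my $fh, '>', "/tmp/probe.$n") {
            push @handles, $fh;
            $n++;
        }
        print "gave up after $n files: $!\n";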


    ;$;=sub{$/};@;=map{my($a,$b)=($_,$;);$;=sub{$a.$b->()}} split//,".rekcah lreP rehtona tsuJ";$\=$;[-1]->();print
Re: Perl limits on number of open files?
by tilly (Archbishop) on May 16, 2004 at 20:30 UTC
    For future reference, you can address this problem with the core module FileCache.
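
    A minimal sketch of FileCache in the poster's scenario (the directory and file count are placeholders; maxopen caps how many handles stay open at once, and FileCache transparently closes the rest and reopens them for append on the next use):

        use FileCache maxopen => 100;

        my $dir = "/tmp/lotsofopenfiles$$";
        mkdir $dir or die "mkdir $dir: $!";

        for my $i (1 .. 50_000) {
            my $fh = cacheout "$dir/$i";   # reopened for append if it was closed
            print $fh "0";
        }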
