I am doing some stress testing on a system. I am running Perl 5.8 on Mac OS X. One of the things we need to do is open a very large number of files. I figured Perl would make it easy to open 50,000 files and write to them once in a while, but I ran into two questions.
First: is there a way to make an array of filehandles? I was in a hurry, so I just programmatically generated a script with a long run of open calls, but this seems lame. The script basically looks like this:
mkdir "/tmp/lotsofopenfiles$$";
open (FILE1, ">>/tmp/lotsofopenfiles$$/1") || warn "$!";
print FILE1 "0";
open (FILE2, ">>/tmp/lotsofopenfiles$$/2") || warn "$!";
print FILE2 "0";
#etc...
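What I was hoping for is something more like the following untested sketch, which stores lexical filehandles in an array (the directory name and the loop count of 50 are just placeholders for the example; the real test would use 50,000):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch: one array of lexical filehandles instead of numbered
# bareword handles (FILE1, FILE2, ...).
my $dir = "/tmp/lotsofopenfiles$$";
mkdir $dir or die "mkdir $dir: $!";

my @fh;
for my $i ( 1 .. 50 ) {    # 50 for the example; 50_000 in the real test
    # open() autovivifies $fh[$i] into a filehandle
    open $fh[$i], '>>', "$dir/$i" or warn "open $dir/$i: $!";
    print { $fh[$i] } "0";
}

# later, write to every still-open handle again
print { $_ } "0" for grep { defined } @fh;
```

Is that the idiomatic way to do it, or is there something better?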
Second: when I run this, I seem to hit an open-file limit of about 250 files per process. I have set ulimit to unlimited and tried running as root, but didn't get past it. I looked through the docs for anything on per-process open-file limits but found nothing. Does anyone know what might be going on here? Here is the error:
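To confirm it's a per-process descriptor cap rather than a bug in my script, I also ran this throwaway probe (a sketch, not the real stress tester), which just opens files until open() fails and reports the count:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Probe: open files until open() fails, holding every handle so the
# descriptors stay in use, then report how many we managed.
my $dir = tempdir( CLEANUP => 1 );
my @keep;                    # keep handles alive so they stay open
my $n = 0;
while ( $n < 10_000 ) {      # cap to keep the probe quick; raise as needed
    open my $fh, '>>', "$dir/$n" or last;
    push @keep, $fh;
    $n++;
}
print "opened $n files before open() failed ($!)\n";
```

On my machine the count it reports is consistent with the ~250 ceiling I keep hitting, regardless of what ulimit says.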
Too many open files at openlotsoffiles.pl line 510.
print() on closed filehandle FILE252 at openlotsoffiles.pl line 511.
Too many open files at openlotsoffiles.pl line 512.
print() on closed filehandle FILE253 at openlotsoffiles.pl line 513.
Any advice on alternate approaches, etc., would be appreciated.