http://qs321.pair.com?node_id=1211106


in reply to Re: Perl Program to efficiently process 500000 small files in a Directory (AIX)
in thread Perl Program to efficiently process 500000 small files in a Directory (AIX)

File slurp cannot make that kind of speed improvement

Replies are listed 'Best First'.
Re^3: Perl Program to efficiently process 500000 small files in a Directory (AIX)
by rminner (Chaplain) on Mar 17, 2018 at 08:02 UTC
    Actually, it did give exactly this performance gain. I changed the reading mechanism after benchmarking the program. The benchmarks showed me that more than 90% of the real runtime of my program was spent on I/O. The files were larger, however (a few thousand XML files), and were located on a normal HDD, not an SSD. He could simply try it and see whether it changes anything for him.
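        A minimal sketch of the two reading styles under discussion — line-by-line reading versus slurping the whole file by localizing `$/` (what File::Slurp does internally). The file here is a throwaway temp file for illustration, not the OP's actual data, and this only shows the mechanics; any speedup depends entirely on the workload and filesystem:

        ```perl
        #!/usr/bin/perl
        use strict;
        use warnings;
        use File::Temp qw(tempfile);

        # Self-contained example: create a small throwaway file to read.
        my ($fh, $path) = tempfile();
        print {$fh} "line $_\n" for 1 .. 100;
        close $fh;

        # Style 1: line-by-line — one readline call per line.
        my $by_lines = '';
        {
            open my $in, '<', $path or die "open: $!";
            while (my $line = <$in>) {
                $by_lines .= $line;
            }
            close $in;
        }

        # Style 2: slurp — localize $/ to undef so <> reads to EOF in one call.
        my $slurped = do {
            open my $in, '<', $path or die "open: $!";
            local $/;    # undef record separator => whole-file read
            <$in>;
        };

        print "identical\n" if $by_lines eq $slurped;
        ```

        Both styles produce the same string; whether the single-read version is faster is exactly what benchmarking (as described above) has to decide.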
      :) There ain't no way the OP's way of slurping is 15 times slower (45/3) than any of the File::Slurp... modules.

      Yes, I believe you saw a time reduction ... whatever part the slurp module played was incidental. It's not going to provide a 15x speedup.