PerlMonks
Re: Quickest way of reading in large files (while v. for)?
by CheeseLord (Deacon) on Aug 20, 2001 at 08:33 UTC
I'd have to say the while version's quicker, simply because it's not going to hog tons of memory storing the entire file in an array as the foreach version will. In fact, just the other day, somebody posted some code that addressed this problem - after changing to a while loop, execution time was cut by over 40%.

From the "I forgot to mention this" dept.: If your file is over 100 meg, not only will foreach be a lot slower, it may not even finish running, due to the memory issue I describe above. I highly recommend using a while loop for a file that large.

[Update: Changed title to match root node]

His Royal Cheeziness
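To illustrate the difference: a sketch (using a small temp file as a stand-in for the large one) contrasting the two loop styles. The key point is that foreach evaluates the filehandle read in list context, slurping every line into memory at once, while a while loop reads one line at a time.

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Build a small sample file (stand-in for the large file under discussion).
my ($out, $file) = tempfile();
print {$out} "line $_\n" for 1 .. 5;
close $out;

# Memory-friendly: while() reads one line per iteration, so only the
# current line is ever held in memory - this scales to files of any size.
my $count_while = 0;
open my $in, '<', $file or die "Can't open $file: $!";
while (my $line = <$in>) {
    chomp $line;
    $count_while++;    # process $line here
}
close $in;

# Memory-hungry: foreach forces <$in> into list context, reading the
# entire file into a list before the first iteration even starts.
my $count_for = 0;
open $in, '<', $file or die "Can't open $file: $!";
foreach my $line (<$in>) {
    chomp $line;
    $count_for++;      # process $line here
}
close $in;

print "while: $count_while, foreach: $count_for\n";
```

Both loops see the same lines; only the foreach version pays the cost of holding the whole file in memory first.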
In Section: Seekers of Perl Wisdom