The /usr/bin/perl executable can take a moment to start before it even begins to look at the script; that time would be saved.
Also, mod_perl keeps the compiled version of the script in memory, so it doesn't need to be recompiled before running again. For a large script several thousand lines long, this saves real time.
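For reference, a minimal sketch of what enabling this looks like in mod_perl 2's standard registry setup (paths are illustrative; assumes Apache with mod_perl installed):

```apache
# Scripts under /var/www/perl are compiled once per Apache child
# and kept in memory by ModPerl::Registry
Alias /perl/ /var/www/perl/
<Location /perl/>
    SetHandler perl-script
    PerlResponseHandler ModPerl::Registry
    PerlOptions +ParseHeaders
    Options +ExecCGI
</Location>
```

With this in place, a script is parsed and compiled on its first request only; subsequent requests reuse the compiled code.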
Lastly, since Perl is used so much, a small saving on each run adds up to a lot of time saved overall!
1) The only time the executable is not already in memory is the first time you type "perl script.pl". If I run a perl -e 'print "Hello World\n";' from my CLI, it responds in an eyeblink. This leads me to believe that the only serious overhead concerns are related to interpretation... hence my suggestions below.
2) You can keep a script in memory by using a while loop, or you can set up a controller script that uses the other scripts you want precompiled and starts one of them up on command.
3) Check the FAQ for more information on compiling scripts and reducing the size of your perl executable.
If you're this worried about performance, consider turning your script into a daemon of sorts, listening on a network socket, a Unix domain socket, or a FIFO of some kind.