PerlMonks
It feels wrong because it is wrong. :)
The way to make your CGI faster is to have it compile once and stay in memory -- hence mod_perl and the like. No other optimization can beat that.

But your idea for a way of working is interesting, and exists for a reason -- probably just not this reason. It allows for things like Caillte's example of a program that is stored non-traditionally, or even one that mutates itself without having to restart. It allows existing scripts to be called from new scripts without having to incorporate their internals into your new script. It probably allows for lots of things; I just can't see using it to make a CGI load faster, since it is fastest not to load the CGI at all (or, really, to load it once and leave it there).

And if your CGI is so huge that the load time is noticeable each time it runs and mod_perl is just not an option, maybe you could break your large script into separate smaller scripts, isolating common elements into a module that can be used from each script. But in practice that may not be any more fun than what you've proposed.

In reply to (ichimunki) Re: Saving compile time by running subroutines as separate files with 'do'
by ichimunki
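For concreteness, here is a minimal sketch of the 'do'-per-subroutine idea under discussion. The file name sub_report.pl and the subroutine report are hypothetical stand-ins; the point is that do reads, compiles, and runs the file each time it is invoked, so it trades one big compile at startup for a compile per loaded file:

```perl
#!/usr/bin/perl
# Sketch (with made-up names) of storing a subroutine in its own file
# and pulling it in with 'do' only when it is actually needed.
use strict;
use warnings;
use File::Temp qw(tempdir);

my $dir     = tempdir(CLEANUP => 1);
my $subfile = "$dir/sub_report.pl";

# Simulate a subroutine kept as a separate script on disk.
open my $fh, '>', $subfile or die "open: $!";
print $fh 'sub report { return "report for $_[0]" } 1;';
close $fh;

# 'do FILE' compiles and executes the file every time it is called,
# which is why it does not save compile time overall -- each request
# still compiles whatever it loads.
do $subfile or die "do failed: $@ $!";

print report("ichimunki"), "\n";    # -> report for ichimunki
```

A module pulled in with use, by contrast, is compiled once per process at startup, which is what makes the mod_perl route (keep the compiled process resident) the real win.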