PerlMonks
Re: Speeding up commercial Web applications
by tachyon (Chancellor) on May 08, 2003 at 04:00 UTC ( [id://256470] )
6-7 seconds of processing is an enormous amount of time unless your machine is dog slow. First, have a look with top to see whether you are running out of RAM and getting into swap space; that will really kill your speed. If so, add more RAM (it's cheaper than hacking time). Perl loves RAM, and so do DBs for that matter; you really can't throw too much RAM at either. The more RAM a DB has available, the more data and indexes it will cache. Most of my servers have at least 2GB.

You must be doing a lot of post-processing on the DB output to take 6-7 seconds. Don't. Sorting, grouping, joining, and selecting are all tasks that should be done at the DB level (fast, and in C), not the application level (slower, in Perl). This may of course entail major recoding. If you are only displaying X results, then add a LIMIT clause so you only get that many rows: quicker to fetch and less to post-process, for a minor change.

You must have a lot of loops to be consuming this much time. Rewrite them to exit as soon as they can, with last or next.
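For example, a loop that bails out with last the moment it has what it needs, rather than grinding through the rest of the data (the data here is made up for illustration):

```perl
use strict;
use warnings;

# Hypothetical data: imagine these are rows fetched from the DB.
my @rows = ( 'apple', 'banana', 'cherry', 'date' );

my $found;
for my $row (@rows) {
    if ( $row eq 'cherry' ) {
        $found = $row;
        last;    # stop looping as soon as we have our answer
    }
}
print "found: $found\n";    # found: cherry
```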
Or, preferably, do it at the DB level as noted. On the subject of loops, you can spend memory and gain speed by building hash lookup tables rather than iterating over lists. If you are doing sorts that can't be done on the DB with an ORDER BY clause, you may need a Schwartzian Transform, depending on the application. If you are calling external processes or including dynamic content, you will benefit from caching static results, updating them as required, and serving those.

PerlRun is an option, but all it really does is save a bit of time on the compile and load, which is apparently not your issue given the time breakdown. It will help, though, and it is the simplest option if the code will run OK under it.

cheers

tachyon

s&&rsenoyhcatreve&&&s&n.+t&"$'$`$\"$\&"&ee&&y&srve&&d&&print
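Update: minimal sketches of the two loop tricks above, with made-up data. A hash lookup table makes each membership test O(1) instead of an O(n) list scan, and a Schwartzian Transform computes an expensive sort key once per element rather than once per comparison:

```perl
use strict;
use warnings;

# Hash lookup table: build it once, then each test is a single
# hash access instead of grepping the whole list every time.
my @banned    = qw( spammer troll );
my %is_banned = map { $_ => 1 } @banned;
print "troll is banned\n" if $is_banned{troll};

# Schwartzian Transform: cache the sort key alongside each value.
# Here the "expensive" key is just length(), for illustration.
my @files  = qw( notes.txt a.log readme.markdown );
my @sorted =
    map  { $_->[1] }                # 3. strip the cached key
    sort { $a->[0] <=> $b->[0] }    # 2. compare cached keys only
    map  { [ length($_), $_ ] }     # 1. build [key, value] pairs
    @files;
print "@sorted\n";    # a.log notes.txt readme.markdown
```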
In Section: Seekers of Perl Wisdom