Re: Speeding up commercial Web applications

by tachyon (Chancellor)
on May 08, 2003 at 04:00 UTC


in reply to Speeding up commercial Web applications

6-7 seconds of processing is an awesome amount of time unless your machine is dog slow. First have a look with top to see if you are running out of RAM and getting into swap space; this will really kill you for speed. If so, add more RAM (it's cheaper than hacking time). Perl loves RAM, and so do DBs for that matter. You really can't throw too much RAM at either. The more RAM a DB has available, the more data and indexes it will cache. Most of my servers have at least 2GB.

You must be doing a lot of post-processing on the DB output to take 6-7 seconds. Don't. Sorting, grouping, joining, and selecting are all tasks that should be done at the DB level (fast, in C), not at the application level (slower, in Perl). This of course may entail major recoding. If you are only displaying X results, add a LIMIT clause so you only get that many rows - quicker to fetch, and less to post-process, for a minor change.
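A minimal sketch of the idea with DBI (the DSN, credentials, and schema here are invented for illustration):

use strict;
use warnings;
use DBI;

# hypothetical connection - adjust the DSN and credentials for your setup
my $dbh = DBI->connect( 'dbi:mysql:appdb', 'user', 'secret',
                        { RaiseError => 1 } );

# the database groups, sums, sorts and trims the result set in C ...
my $rows = $dbh->selectall_arrayref(
    'SELECT customer, SUM(amount) AS total
       FROM orders
      GROUP BY customer
      ORDER BY total DESC
      LIMIT 20'
);

# ... so Perl only has to format 20 rows instead of post-processing thousands
printf "%s: %s\n", @$_ for @$rows;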

You must have a lot of loops to be consuming this much time. Rewrite them either to exit as soon as they can, i.e.:

# slow print "Found" if grep{ /$find/ } @list # always faster and uses less memory for (@list) { do{ print "Found"; last } if /$find/; }

or preferably do it at DB level as noted.

On the subject of loops: you can spend memory and gain speed by building hash lookup tables rather than iterating over lists.
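A minimal sketch of the trade-off (the data here is invented): build the hash once, then every membership check is a single hash probe instead of a scan.

my @valid_ids = ( 1001, 1002, 1003 );   # imagine thousands of entries
my @incoming  = ( 1002, 42, 1003 );

# slow - a full O(n) scan of @valid_ids for every incoming id
for my $id (@incoming) {
    print "$id ok\n" if grep { $_ == $id } @valid_ids;
}

# build the lookup table once, then each check is O(1)
my %is_valid = map { $_ => 1 } @valid_ids;
for my $id (@incoming) {
    print "$id ok\n" if $is_valid{$id};
}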

If you are doing sorts that can't be done on the DB with an ORDER BY clause, you may need a Schwartzian Transform, depending on the application.
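For example (sorting log files by size - a made-up scenario), the transform computes the expensive sort key once per element rather than once per comparison:

# decorate with the key, sort on it, then strip the decoration;
# -s (file size) is called once per file, not once per comparison
my @by_size =
    map  { $_->[1] }                 # undecorate: keep the filename
    sort { $a->[0] <=> $b->[0] }     # sort on the cached size
    map  { [ -s $_, $_ ] }           # decorate: stat each file once
    glob '*.log';

print "$_\n" for @by_size;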

If you are calling external processes or including dynamic content, you will benefit from caching static results, updating them only as required, and serving the cached copies.
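A minimal sketch of that idea, assuming a single cache file and a made-up expensive_report() standing in for the external process or heavy query:

use strict;
use warnings;

my $cache_file = '/tmp/report.html';   # hypothetical location
my $max_age    = 300;                  # regenerate at most every 5 minutes

my $html;
if ( -e $cache_file and time - ( stat $cache_file )[9] < $max_age ) {
    # cache hit: serve the stored copy
    open my $fh, '<', $cache_file or die "read $cache_file: $!";
    local $/;                          # slurp mode
    $html = <$fh>;
}
else {
    # cache miss: do the expensive work once and store the result
    $html = expensive_report();
    open my $fh, '>', $cache_file or die "write $cache_file: $!";
    print $fh $html;
}
print $html;

sub expensive_report {
    # placeholder for the real external process or big query
    return "<html>generated " . localtime() . "</html>\n";
}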

PerlRun is an option, but all it really does is save a bit of time on the compile and load, which apparently is not your issue given the time breakdown. It will help, though, and is the simplest option if the code will run OK under it.
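For reference, the usual mod_perl 1.x configuration looks something like this (the alias and path are assumptions; adjust for your layout):

# httpd.conf - run unmodified CGI scripts under Apache::PerlRun
Alias /cgi-perl/ /usr/local/apache/cgi-bin/
<Location /cgi-perl>
    SetHandler  perl-script
    PerlHandler Apache::PerlRun
    Options     +ExecCGI
    PerlSendHeader On
</Location>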

cheers

tachyon

s&&rsenoyhcatreve&&&s&n.+t&"$'$`$\"$\&"&ee&&y&srve&&d&&print

Re: Re: Speeding up commercial Web applications
by Willard B. Trophy (Hermit) on May 08, 2003 at 14:31 UTC
    Though I agree about the RAM thing, using DBMS functions to process the data is not always faster. I've found SELECT DISTINCT to be, on occasion, many times slower than feeding the output of a simple SELECT into a hash.

    This may have been an unusual case. I was retrieving distinct values of a single column from a table of about 125,000 rows in Empress, which isn't that much of a mainstream system. Still, I seem to remember that using the Perl hash was more than 20x faster than letting Empress do the SELECT DISTINCT thing.
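    Something like this sketch (the DSN and schema are made up, and DBD::Empress details may differ):

    use DBI;

    # hypothetical connection - the Empress DSN syntax may vary
    my $dbh = DBI->connect( 'dbi:Empress:mydb', 'user', 'secret',
                            { RaiseError => 1 } );

    # plain SELECT, de-duplicated in a Perl hash
    my $sth = $dbh->prepare('SELECT city FROM customers');
    $sth->execute;

    my %seen;
    while ( my ($city) = $sth->fetchrow_array ) {
        $seen{$city}++;
    }
    my @distinct = keys %seen;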

    --
    bowling trophy thieves, die!
