PerlMonks
Re: Speeding up commercial Web applications

by BrowserUk (Patriarch)
on May 08, 2003 at 02:29 UTC ( [id://256451] )


in reply to Speeding up commercial Web applications

Profile. See Devel::DProf. Concentrate your efforts at the bottlenecks. If you find slow bits that you can't see how to improve, post them.
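A minimal run with the Devel::DProf profiler mentioned above might look like this (myapp.pl stands in for the real application script):

```shell
perl -d:DProf myapp.pl   # run under the profiler; writes tmon.out to the current directory
dprofpp tmon.out         # summarize the most expensive subroutines
```

dprofpp's default report lists the subroutines by cumulative time, which is usually enough to spot the bottleneck.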

If you're reading files, templates, and the like from disk, create a RAM drive and load them from there.

NB: This is highly speculative! If there truly are lots of globals in use, you might get a little from making them lexicals by declaring them at the top of each file using our, assuming you're using a version of perl that supports this.

Beyond that, you're the man who can see the code. What's it doing that takes so long? PROFILE :)


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller

Replies are listed 'Best First'.
Re: Re: Speeding up commercial Web applications
by grantm (Parson) on May 08, 2003 at 09:02 UTC
    you might get a little from making them lexicals by declaring them at the top of each file using our

    It is my understanding from this doc that our creates globals. The perl561delta.pod document mentions that they "... can be best understood as a lexically scoped symbolic alias to a global variable" (which I must confess did not assist my understanding one iota :-) ). But 'our' variables do create symbol table entries with the associated typeglob memory footprint overheads.

      Hence the speculation.

      It was pointed out to me that, in some circumstances, using lexicals rather than globals is quicker. One explanation I saw, but cannot now find, is that it is quicker to find a lexical than a global. I took this to be something to do with the fact that the compiler has to know where a lexical is at compile time, and effectively hard-codes the 'location' into the optree, but that globals are 'found' each time, at runtime. The technical detail may be wrong, but this simplified imagery "works for me" ... until a better explanation is available.
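The claim is easy to measure with the core Benchmark module; the variable names here are made up for illustration:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

use vars qw($global);   # a package global, found via the symbol table
my $lexical = 0;        # a lexical, resolved to a pad slot at compile time
$global = 0;

# Run each sub exactly 10_000 times and print a comparison table.
cmpthese(10_000, {
    global  => sub { $global++  for 1 .. 1000 },
    lexical => sub { $lexical++ for 1 .. 1000 },
});
```

On most perls the lexical loop comes out measurably ahead, for the reason sketched above, though the margin varies by version.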

      How far this case extends I have never tried to ascertain, as I have found very few cases where I use globals, and in the few cases I do, they are never critical to performance, but the OP seemed to be looking for 'simple' measures that might help. Adding use strict at the top of a module will rapidly spit out the names of the globals. Adding one line, our ($a, $b, $this, $that); with an appropriate comment seems a simple enough change to at least make it worth trying, given the lack of other information and the restrictions imposed by the OP's question.
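The one-line change described above, sketched with hypothetical variable names (in a real module, strict's error messages would supply the actual names):

```perl
use strict;

# These were previously undeclared package globals; strict would have
# refused to compile them. our() keeps them as package globals while
# satisfying strict -- no other change in behaviour.
our ($config, $debug, %cache);

$config = 'app.conf';
$cache{hits} = 0;
```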


Re: Re: Speeding up commercial Web applications
by PotPieMan (Hermit) on May 08, 2003 at 02:44 UTC
    I have profiled, more with timestamping than anything else. Sorry for not stating this in my original writeup.

    I guess this is more a question of what to do if you have profiled, proven that the application is at fault, and probably won't be able to run it in mod_perl. Anything but throw more powerful processors at it or buy/write a new application?

    Thanks for your input, BrowserUk and Abigail-II.

      Sometimes, but only sometimes, small changes to the code can reap large rewards. As an example, if the application truly is Perl 4 code, then I believe (but could be wrong) that Perl 4 didn't support hashes. Assuming for the moment that is true -- I'll be quickly corrected if it isn't :) -- and (for example) the application does any sort of lookups into arrays of data using grep, then changing the array(s) being searched linearly to hash(es) could have a dramatic effect without too much effort.
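A minimal sketch of that change, on made-up data (the names are hypothetical, not from the application under discussion):

```perl
use strict;
use warnings;

my @valid_users = qw(alice bob carol);

# Linear scan: grep walks the whole array on every call.
sub is_valid_grep {
    my $u = shift;
    return scalar grep { $_ eq $u } @valid_users;
}

# Hash lookup: build the hash once, then each exists() is constant time.
my %valid_user = map { $_ => 1 } @valid_users;
sub is_valid_hash {
    my $u = shift;
    return exists $valid_user{$u};
}
```

The one-off cost of building the hash is repaid after a handful of lookups; on a large array inside a hot loop the difference can be dramatic.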

      I'm not really sure I understand the reluctance to modify. You said this is because you hope to upgrade to the next version sometime in the future. If you make changes now, how does that stop you upgrading? The only reason refactoring the code (ie. not changing what the code does, but only the way it does it) would impact your upgrade is if you discovered that the later version wasn't as efficient as your modified version. In which case you would have very strong grounds for requesting that your changes be fed back into the latest version before you took delivery. The supplier might even thank you for it and reduce your bill (some chance :). Their customers almost certainly would thank you.

      At the end of the day, if you change nothing, nothing will change. If you can't change the code, then you already know the other options: more memory, a faster processor, faster hard disks, etc.

      I think you already knew this though, so it begs the question, what were you hoping for?

      Oh! You have already done the Monk's ritual haven't you?



        Take three jars of incense, two large black candles, and a mag tape containing the sources (floppies or a cd might work) to a virgin piece of consecrated ground, within earshot of the monastery bells (not sure if it would work via internet radio??), at dawn on the third Tuesday after the first Monday before Ash Wednesday, by the Gregorian calendar (See Date::Manip, but take 3 strong men with you) ....

        Moved here from my scratchpad to free its use for other things and maintain the link.

        Perl 4 did have hashes but not references (or at least very limited support for references). Nested data structures were out of the question. I have some Perl 4 code I still use that uses join and split a lot to accomplish what I'd do today with an HoH.
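The two idioms side by side, on a made-up record (the field names are hypothetical):

```perl
use strict;
use warnings;

# Perl 4 style: a flat string, packed and unpacked with join/split.
my $record = join ':', 'alice', 'admin', 42;
my ($name, $role, $logins) = split /:/, $record;

# Perl 5 style: the same data as a hash of hashes, no packing needed.
my %user = (
    alice => { role => 'admin', logins => 42 },
);
```

The join/split version breaks as soon as a field contains the delimiter; the HoH carries nested structure directly.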

        90% of every Perl application is already written.
        dragonchild
        I think you already knew this though, so it begs the question, what were you hoping for?

        A magic wand, a different angle, anything that I hadn't thought of. Can't you tell? I'm frustrated.

        Nevertheless, thanks for your help. We will probably end up delaying until new hardware is in the budget. :-/

      Well, your major concern seems to be avoiding large changes to ensure that you can hope to upgrade - have you tried talking to the vendor, to see if either they are already working on speed improvements, or would consider rolling your changes into any revision of the product?

      Hugo
        Yes, but their timeline does not match up with ours. We are on our own as far as modifications go.
