PerlMonks
I will be very interested in the outcome of such a study. We are right in the middle of profiling some data loading code that takes about half an hour to run on some very beefy hardware (a Sun SPARC 4500 with 2 GB of memory, etc.).
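While the profiler grinds away, Perl's core Benchmark module is a cheap way to compare individual call styles in isolation. The data, sizes, and sub names below are made-up stand-ins (our real code is proprietary), but the shape of the comparison is the one discussed here: handing a sub one reference versus handing it a whole flattened list.

```perl
use strict;
use warnings;
use Benchmark qw(timethese);

# Illustrative stand-in data; the names and size are invented for this sketch.
my @data = (1 .. 10_000);

# Receives a single reference: only one scalar goes onto the argument stack.
sub by_ref { my ($r) = @_; return scalar @$r }

# Receives the flattened list: all 10_000 element aliases are placed in @_.
sub by_list { return scalar @_ }

timethese( 2_000, {
    pass_reference => sub { by_ref(\@data)  },
    pass_full_list => sub { by_list(@data) },
} );
```

Benchmark ships with every Perl, so this kind of micro-comparison costs nothing to set up once the profiler has pointed at a suspect.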
One of the things we are looking at is de-referencing. Assume you have a hash reference to a complex data structure, and imagine that it is actually a reference to an enormous object, with grandchildren, great-grandchildren, and great-great-great-great-grandchildren. From what I can see, every time you de-reference some or all of the object, a copy is made. In the case where these are really big objects, as some of ours are, that would be rather rough, especially if we iterate through children, then grandchildren, then deeper still. At that point, the savings gained by passing a reference to a function are lost, because a copy of the referent is made inside the function.

For instance, if a sub takes a reference and de-references it into a list, a copy of that list is made at that point. If we change the sub to take not a reference but the whole list, is this equivalent, or is it different? And of course, if the subroutine de-references the thing more than once, you could be incurring huge overhead. This is not at all an issue with a puny example, but we have rather hierarchical data, with objects that contain child objects, which have children of their own, and so on. All method calls are by reference, and most references are de-referenced all over the place.

I can't send you our code (proprietary, you know), but I might be able to put together a better example to show you what I mean. Right now, one of our guys is looking into this. I will publish the results of our study here.

Brian - a.k.a. DrSax

In reply to Re: Perl Optimization by DrSax
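A minimal, self-contained sketch of the copying question raised above (the structure and names are invented, not from our code). It shows what one can actually observe: de-referencing an array into a new array copies only the top-level list of references, not the referents, and passing a de-referenced list to a sub puts aliases into @_ rather than copies.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A small stand-in for the large hierarchical structure described above.
my $obj = {
    name     => 'root',
    children => [ map { +{ id => $_, payload => 'x' x 10 } } 1 .. 5 ],
};

# De-referencing into a new array copies the top-level list of references
# (a shallow copy); the child hashes themselves remain shared.
my @kids = @{ $obj->{children} };
$kids[0]{id} = 999;
print $obj->{children}[0]{id}, "\n";   # 999 - the child hash is shared, not deep-copied

# Passing the de-referenced list to a sub places aliases in @_,
# so the referents are not copied at call time either.
sub touch_first { $_[0]{id} = -1 }
touch_first( @{ $obj->{children} } );
print $obj->{children}[0]{id}, "\n";   # -1 - the sub modified the original
```

So the top-level list itself can be copied repeatedly, which is worth measuring on deep structures, but the referents underneath are not duplicated by de-referencing alone.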