PerlMonks
Fair enough....
In my readmore tags I explained that each copy works on a different directory, that each directory is very transient, and that the race condition is beyond my control.
The extra memory overhead comes from the Perl interpreter, not the code itself (or at least that is my belief) - see below:
The above code shows up in ps -el with almost the same sz as the code in my readmore tags. Forking will not buy me anything, as I understand it, since I will be making an exact duplicate (memory and all). I was thinking threads might help, but as I understand them, each thread gets its own copy of the interpreter, so there are no memory savings there either. So my question, stated more clearly, is:
Given a piece of code to parse a single directory, how can I parse multiple directories concurrently (or very nearly) without the memory overhead of each piece requiring its own interpreter?
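For what it's worth, on most modern Unixes fork(2) is copy-on-write: the children share the parent interpreter's memory pages until one of them writes to a page, so the ps sz figure overstates the real cost of extra processes. Under that assumption, one sketch is to load the interpreter once and fork a child per directory (here @dirs and parse_dir are placeholders for your real directory list and parsing code, not anything from the post):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Placeholder: substitute the real list of transient directories
my @dirs = ('/tmp/dir1', '/tmp/dir2', '/tmp/dir3');

sub parse_dir {
    my ($dir) = @_;
    # ... the existing single-directory parsing code goes here ...
}

my @pids;
for my $dir (@dirs) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {           # child: shares the interpreter's pages (COW)
        parse_dir($dir);
        exit 0;
    }
    push @pids, $pid;          # parent: remember the child so we can reap it
}
waitpid($_, 0) for @pids;      # wait for all children to finish
```

This only helps if copy-on-write holds on your platform; whether the pages stay shared in practice depends on how much the parsing code writes to interpreter-owned structures.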
I freely admit that I may be asking to get something for nothing, but it seems like an awful waste not to be able to use the Perl code and continue using the shell script :-(

Cheers - L~R

In reply to Re: Re: 3 weeks wasted? - will threads help?
by Limbic~Region