PerlMonks
All:
I have spent the last 3 weeks converting a suite of shell scripts to Perl. The purpose of this can be found here, although two of my initial requirements changed after a very long, hard look at the transient files I was checking against.
The following is the final code:
The traplist file that the data is read from looks like:
Created          Expires          Use  Type  Author  Size   Name   Trap
07:36:56-07:36   07:36:56-07:36   1    0     XYZ     98765  SIZE   N/A
07:36:56-07:36   07:36:56-07:36   1    0     XYZ     N/A    TRAP1  cool things to look for
If arg1 = blah, you would look for the traplist file in /var/spool/wt400/gateways/blah
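To make the layout above concrete, here is a minimal sketch of locating and parsing one traplist record. It assumes the file is literally named "traplist" (the post doesn't give the exact filename) and that records are whitespace-separated, with only the trailing Trap column free to contain spaces, so each line is split into at most 8 fields:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical path rule from the post: arg1 names the gateway dir.
my $gateway = shift @ARGV // 'blah';
my $file    = "/var/spool/wt400/gateways/$gateway/traplist";

# Split on whitespace, capped at 8 fields so the Trap column
# keeps its embedded spaces.
sub parse_traplist_line {
    my ($line) = @_;
    my %rec;
    @rec{qw(created expires use type author size name trap)} =
        split ' ', $line, 8;
    return \%rec;
}

my $rec = parse_traplist_line(
    '07:36:56-07:36 07:36:56-07:36 1 0 XYZ N/A TRAP1 cool things to look for'
);
print "$rec->{name}: $rec->{trap}\n";   # TRAP1: cool things to look for
```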
Ok, so without further ado, here is my problem: I need about 20 copies of the exact same script running, where the only difference is the two arguments passed to each, because there is a race condition beyond my control; as a result I am now using far more memory than the shell scripts ever did. I compared:
I know where the gap is coming from, and given everything else I gained I could live with the difference if it were only one copy; but that difference gets multiplied by every copy running (about 20). The only thing that comes to mind is threads, but I have heard such conflicting information that I didn't even consider them when I started the port. Do I have to abandon my code, or is there a way to take advantage of my multi-proc high-end server and have one, or maybe two or three, processes handle all the directories? Thanks in advance - L~R

In reply to 3 weeks wasted? - will threads help? by Limbic~Region
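For reference, the consolidation the poster is asking about can be sketched with Perl's ithreads: one process with a worker thread per gateway directory instead of ~20 identical script copies. The gateway names and the body of check_traplist() below are hypothetical stand-ins for the real arguments and checking logic:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;

# Hypothetical list of gateway directories (arg1 values).
my @gateways = qw(blah foo bar);

# Stand-in for the real per-directory check loop.
sub check_traplist {
    my ($gateway) = @_;
    my $file = "/var/spool/wt400/gateways/$gateway/traplist";
    return -e $file ? "would check $file" : "no $file yet";
}

# One worker thread per directory; join collects each result.
my @workers = map { threads->create(\&check_traplist, $_) } @gateways;
print $_->join(), "\n" for @workers;
```

One caveat worth noting: Perl's ithreads clone the interpreter's data for each thread, so the per-thread memory cost can approach that of separate processes. On a multi-processor box, forking from one parent (benefiting from copy-on-write) or a single process polling all directories in a loop may save more memory than threads do.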