PerlMonks
Looking at your code, you appear to be processing one, the other, or both of the directories 'in/*' & 'out/*' relative to the path "/var/spool/wt400/gateways/" . $ARGV[0]. Presumably each of your 20 copies of the script is processing a different subtree of /var/spool/wt400/gateways/? In that case, you could do your initial chdir to /var/spool/wt400/gateways/, do your globbing as <*/in/*> etc., and process the files from all 20 subdirs in one loop.

I notice that you have a sleep 3 in your main loop, which probably means you're not utilising much of the CPU as it stands, so you should easily have enough processor to cope with all 20 dirs in one loop. You might need to change that sleep to sleep 3 - $time_spent_last_pass.

I realise that the traplist file is different for each subtree, but the <*/in/*> form of the glob returns the filenames in the form subdir/in/file, so you can split on the /'s, extract the subdir, and use it as the first key in your %traps hash to select the appropriate set of traps information for the file.

It means re-working your code somewhat, but probably less work than moving to either use threads; or fork.

Examine what is said, not who speaks. The 7th Rule of perl club is -- pearl clubs are easily damaged. Use a diamond club instead.

In reply to Re: 3 weeks wasted? - will threads help?
by BrowserUk
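
The single-process loop described above can be sketched roughly as follows. This is only an illustration of the shape, not the poster's actual code: the subroutines load_traps() and process_file(), and the traplist filename, are hypothetical stand-ins for whatever the original script does per subtree.

```perl
#!/usr/bin/perl
# Sketch: one process servicing all 20 gateway subtrees in a single loop.
# load_traps(), process_file(), and the 'traplist' filename are assumptions
# standing in for the original poster's per-subtree code.
use strict;
use warnings;
use Time::HiRes qw( time sleep );

chdir '/var/spool/wt400/gateways'
    or die "chdir failed: $!";

# One traps entry per subtree, keyed by subdir name.
# (Assumes the top level contains only the gateway subdirs.)
my %traps = map { $_ => load_traps( "$_/traplist" ) } glob '*';

while( 1 ) {
    my $start = time;

    for my $path ( glob '*/in/*' ) {
        # glob returns 'subdir/in/file'; split out the subdir
        # and use it to select the right traps set.
        my( $subdir, undef, $file ) = split m{/}, $path, 3;
        process_file( $path, $traps{ $subdir } );
    }

    # Keep roughly the original 3-second cadence: sleep only
    # for whatever is left of the 3 seconds after this pass.
    my $remaining = 3 - ( time() - $start );
    sleep $remaining if $remaining > 0;
}
```

The split-on-'/' step is the key piece: because the glob pattern starts one level above each subtree, every returned path carries its subtree name as its first component, which is exactly the key needed into %traps.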