PerlMonks  

sending multiple commands

by didess (Sexton)
on Jan 07, 2011 at 16:03 UTC [id://881100]

didess has asked for the wisdom of the Perl Monks concerning the following question:

Hi all !

I'm trying, so far unsuccessfully, to do with Perl what I often do with the shell on various flavors of *nix (AIX, Linux, ...); there's no need for it on Windows.

I want to send several commands in parallel, in the background, then wait for all of the processes to finish.

This drastically reduces execution time when managing numerous servers through ssh commands.

Example in shell:

    export PIDS=""
    for P in $PARAMS
    do
        some-command $P > /tmp/some-file.$P &
        PIDS="$PIDS $!"
    done
    wait $PIDS
    # ... continue work ...
I think it should work even better in Perl, but so far I haven't succeeded.

Any hint ?

Thanks to all of you

Replies are listed 'Best First'.
Re: sending multiple commands
by sundialsvc4 (Abbot) on Jan 07, 2011 at 16:18 UTC

    The way to do this is to do what the shells do: spawn multiple processes and wait for all of them to complete. I suggest that you start your search with Parallel::ForkManager.

    There are many more sophisticated workflow management tools available, e.g. POE, so as you contemplate your requirements, try to articulate them carefully. It is very unlikely that you will have to “write anything new,” even when building a very complex application of this general type, so if you find yourself thinking that you do, step back and keep looking on http://search.cpan.org (and, of course, right here).
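    A minimal sketch of the Parallel::ForkManager approach, using hypothetical stand-ins (`some-command` and a made-up parameter list) for the poster's real $PARAMS:

```perl
use strict;
use warnings;
use Parallel::ForkManager;

# Hypothetical stand-ins for the shell example's $PARAMS and some-command.
my @params = qw(alpha beta gamma);

my $pm = Parallel::ForkManager->new(4);   # run at most 4 children at once

for my $p (@params) {
    $pm->start and next;   # parent: start returns the child's pid, so skip ahead
    # child: run the command with the same redirection as the shell version
    system("some-command $p > /tmp/some-file.$p");
    $pm->finish;           # child exits here
}

$pm->wait_all_children;    # equivalent of the shell's 'wait $PIDS'
# ... continue work ...
```

    The constructor argument caps concurrency, which the plain shell loop cannot do.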

Re: sending multiple commands
by mirod (Canon) on Jan 07, 2011 at 16:20 UTC

    You could use "naked" fork for this, but it looks like Parallel::ForkManager would let you write the code pretty much the way you wrote it in your example.
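    For comparison, a "naked" fork version using only core Perl, again with a hypothetical parameter list and command name; it mirrors the shell script line for line:

```perl
use strict;
use warnings;

# Hypothetical stand-in for the shell example's $PARAMS.
my @params = qw(alpha beta gamma);
my @pids;

for my $p (@params) {
    defined(my $pid = fork) or die "fork failed: $!";
    if ($pid == 0) {
        # child: behaves like 'some-command $P > /tmp/some-file.$P &'
        exec "some-command $p > /tmp/some-file.$p";
        die "exec failed: $!";
    }
    push @pids, $pid;   # parent: remember the child, like PIDS="$PIDS $!"
}

# equivalent of the shell's 'wait $PIDS'
waitpid $_, 0 for @pids;
# ... continue work ...
```

    The redirection in the exec string makes Perl hand the command to the shell, just as the original script did.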

Re: sending multiple commands
by Anonyrnous Monk (Hermit) on Jan 07, 2011 at 16:28 UTC

    There are various modules on CPAN for this.  One of them is Proc::Background:

    use Proc::Background;

    my @params = ...;
    my @procs;
    for my $p (@params) {
        my $cmd = "some-command $p > /tmp/some-file.$p";
        push @procs, Proc::Background->new($cmd);
    }
    $_->wait for @procs;
    # ... continue work
Re: sending multiple commands
by sundialsvc4 (Abbot) on Jan 08, 2011 at 17:19 UTC

    One further thought: if you have a lot of these commands to issue, set up a thread-safe FIFO queue (or pipe, as the case may be) that you push the commands into, and set up a limited number of worker threads/processes that pop work-requests off of that queue. The queue provides the necessary “flexible hose” that keeps the system from over-committing itself if a sudden flood of requests appears.

    Simple shell scripts sometimes do not take this into consideration, and the not-so-affectionate name for what happens (sometimes used as a denial-of-service attack) is a “fork bomb.”
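    That worker-pool idea can be sketched with core threads and Thread::Queue, assuming a perl built with ithread support; the command strings and the pool size of 4 are illustrative, not prescribed:

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;

my $MAX_WORKERS = 4;               # cap on concurrent commands
my $queue = Thread::Queue->new;

# A fixed pool of workers pops commands off the shared queue.
my @pool = map {
    threads->create(sub {
        while (defined(my $cmd = $queue->dequeue)) {
            system $cmd;
        }
    });
} 1 .. $MAX_WORKERS;

# However many requests arrive, no more than $MAX_WORKERS run at once.
$queue->enqueue("some-command $_ > /tmp/some-file.$_") for qw(alpha beta gamma);

$queue->end;         # no more work: dequeue now returns undef to each worker
$_->join for @pool;
# ... continue work ...
```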

      Parallel::ForkManager (which you originally suggested) provides a simple way to specify the maximum number of parallel processes — no need to reinvent the wheel.

Approved by Corion
Front-paged by Corion