http://qs321.pair.com?node_id=1091143

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hello Monks, I want to invoke multiple bash scripts on different machines and not worry about the result values. Right now the backticks wait for each script to end and collect its results, which I do not want to do.
`ssh -q -o ConnectTimeout=10 $usr1@host1 /tmp/start $V1 $v2`;
`ssh -q -o ConnectTimeout=10 $usr2@host2 /tmp/start`;
Can you please help?

Replies are listed 'Best First'.
Re: invoke multiple bash scripts on different machines
by hippo (Bishop) on Jun 25, 2014 at 08:14 UTC

    If you do not want to capture the stdout or stderr from the commands, don't use backticks, use system instead.

    If you do not want to wait for the process to finish, either fork at the local end or background at the remote end and detach.
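
    A minimal sketch of the fork-at-the-local-end approach, wrapped in a small helper. The host names, user names, and `/tmp/start` arguments below are hypothetical, taken from the question, not a tested deployment:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Launch a command in a child process and return immediately;
    # the parent never blocks on the child's output or exit status.
    sub fire_and_forget {
        my @cmd = @_;
        defined( my $pid = fork() ) or die "fork failed: $!";
        if ( $pid == 0 ) {    # child: exec replaces this process
            exec(@cmd) or die "exec @cmd failed: $!";
        }
        return $pid;          # parent carries on at once
    }

    # Hypothetical hosts and script path, matching the OP's commands:
    # fire_and_forget('ssh', '-q', '-o', 'ConnectTimeout=10',
    #                 "$usr1\@host1", '/tmp/start', $V1, $v2);
    # fire_and_forget('ssh', '-q', '-o', 'ConnectTimeout=10',
    #                 "$usr2\@host2", '/tmp/start');

    Note that the list form of exec avoids a shell entirely, so the arguments need no extra quoting on the local side.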

Re: invoke multiple bash scripts on different machines
by monkey_boy (Priest) on Jun 25, 2014 at 07:48 UTC
    Could you just run each process in the background with '&'?
    `ssh -q -o ConnectTimeout=10 $usr1@host1 /tmp/start $V1 $v2 &`;
    `ssh -q -o ConnectTimeout=10 $usr2@host2 /tmp/start &`;
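
    One caveat, sketched below with hypothetical user/host names: a bare `&` inside backticks often still blocks, because backticks read the command's stdout until EOF, and the backgrounded ssh keeps that pipe open until it exits. Redirecting the output lets the backticks see EOF immediately:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # With output redirected, the shell backgrounds ssh and exits,
    # the pipe closes, and the backticks return at once.
    `ssh -q -o ConnectTimeout=10 user1\@host1 /tmp/start V1 v2 >/dev/null 2>&1 &`;
    `ssh -q -o ConnectTimeout=10 user2\@host2 /tmp/start >/dev/null 2>&1 &`;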



    This is not a Signature...
Re: invoke multiple bash scripts on different machines
by blue_cowdawg (Monsignor) on Jun 25, 2014 at 13:58 UTC

    Here's a thought:

    #!/usr/bin/perl -w
    use strict;

    my $work = {
        "host1" => {
            userid => "user1",
            args   => [ "thing1", "thing2", "red thing", "blue thing" ],
        },
        "host2" => { userid => "user2", args => [] },
        "host3" => { userid => "user3", args => [] },
        # etcetera
    };

    foreach my $host ( keys %$work ) {
        my $uid  = $work->{$host}->{userid};
        my @args = @{ $work->{$host}->{args} };
        if ( fork() == 0 ) {
            my $cmdline = sprintf( "ssh %s\@%s /tmp/start %s",
                $uid, $host, join( " ", @args ) );
            system($cmdline);
            exit(0);    # child is done after running its command
        }
    }
    1 while wait() != -1;    # wait here until all children exit
    exit(0);


    Peter L. Berghold -- Unix Professional
    Peter -at- Berghold -dot- Net; AOL IM redcowdawg Yahoo IM: blue_cowdawg
Re: invoke multiple bash scripts on different machines
by vinoth.ree (Monsignor) on Jun 25, 2014 at 13:40 UTC

    Hi,

    If you log in to a remote server via ssh, start a shell script or command, and then exit (aborting the remote connection), the process/command will be killed.

    Sometimes a job or command takes a long time. If you are not sure when it will finish, it is better to leave it running in the background (&). But if you log out of the system, the job will be stopped and terminated by your shell. So what do you do to keep a job running in the background when the process gets SIGHUP?

    Use the nohup command-line utility, which lets you run a command, process, or shell script that keeps running in the background after you log out of the shell.

    `ssh -q -o ConnectTimeout=10 $usr1@host1 nohup /tmp/start $V1 $v2 &`;
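
    To make this reliable, it also helps to redirect the remote job's input/output and background it on the remote side, so ssh exits as soon as the job is launched. A sketch with a small helper that builds the remote command string (the host/user names and `/tmp/start` arguments are hypothetical, as in the question):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Build a remote command that survives the ssh session ending:
    # nohup ignores SIGHUP, the redirections detach it from the
    # session's streams, and the trailing '&' backgrounds it remotely.
    sub detached_remote_cmd {
        my ( $script, @args ) = @_;
        my $job = join( ' ', $script, @args );
        return "nohup $job >/dev/null 2>&1 </dev/null &";
    }

    my $remote = detached_remote_cmd( '/tmp/start', 'V1', 'v2' );
    # system('ssh', '-q', '-o', 'ConnectTimeout=10', 'user1@host1', $remote);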

    All is well