http://qs321.pair.com?node_id=1071799

gri6507 has asked for the wisdom of the Perl Monks concerning the following question:

Fellow monks,

I am writing some code where I have two processes created using fork() which communicate via sockets. I need to have only one of those processes execute functions, but the call to those functions may come from either process. So, I am trying to come up with a scheme to pass a function call, along with all arguments, from one process to the other via the socket. Here's what I have so far:

    use warnings;
    use strict;
    use Socket;
    use IO::Handle;
    use Data::Dumper;

    socketpair(WORKER_SOCKET, GUI_SOCKET, AF_UNIX, SOCK_STREAM, PF_UNSPEC)
        || die "socketpair: $!";
    WORKER_SOCKET->autoflush(1);
    GUI_SOCKET->autoflush(1);

    if (my $pid = fork()) {
        close GUI_SOCKET;
        my $line = <WORKER_SOCKET>;
        if ($line =~ /^EXECUTE:(.*)$/) {
            # my $fx = "extracted function name";
            # my @args = "extracted args";
            &{\&{$fx}}(@args);
        }
    }
    else {
        die "cannot fork: $!" unless defined $pid;
        close WORKER_SOCKET;
        # redirect the STDOUT of the WORKER script to GUI (and main Bacnet Tool
        # communication) process.
        STDOUT->autoflush(1);
        my $a = 'baz';
        my @b = qw(foo bar);
        my $c = \@b;
        print GUI_SOCKET "EXECUTE:doit($a, @b, $c)\n";
    }

    sub doit {
        my @q = @_;
        print "\nDoing it " . Dumper(\@q);
    }
I have tried using the FreezeThaw module to serialize/deserialize the arguments on opposite sides of the socket. However, this does not work for arguments that are references, such as the 3rd argument in my example. Any ideas how else I can make such a remote function call?

Replies are listed 'Best First'.
Re: Executing functions from another process
by davido (Cardinal) on Jan 23, 2014 at 19:11 UTC

    Set up a dispatch table in the process that is to execute functions. Pass function calls and parameters as JSON from one process to the other. On the recipient side, decode the JSON, which will contain a dispatch table entry key, and the parameters to pass. ...no mucking in the symbol table that way, and no obscure, one-off protocols for data exchange.
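    A minimal sketch of this approach (the %dispatch table, the "doit" entry, and the message keys "function" and "parameters" are illustrative names, not from the thread), assuming the core JSON::PP module:

        use strict;
        use warnings;
        use JSON::PP qw(encode_json decode_json);

        # Dispatch table in the worker process: message keys map to code refs,
        # so nothing is looked up in the symbol table by name.
        my %dispatch = (
            doit => sub { my @args = @_; return scalar @args },
        );

        # Sender side: encode the call as one JSON line and print it to the socket.
        my $msg = encode_json({
            function   => 'doit',
            parameters => [ 'baz', [ 'foo', 'bar' ] ],
        });

        # Recipient side: decode the line, look up the handler, pass the parameters.
        my $call    = decode_json($msg);
        my $handler = $dispatch{ $call->{function} }
            or die "unknown function $call->{function}";
        my $result  = $handler->( @{ $call->{parameters} } );

    In the real program the JSON string would travel over GUI_SOCKET/WORKER_SOCKET with a newline terminator, exactly like the EXECUTE: line in the original code.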


    Dave

      Conceptually I understand how this could be done, but I have two problems with this proposed solution. In my full application, the functions being executed are not even defined in the same script; they get pulled in via a do() call to another script. Keeping a dispatch table in this script up to date would be a maintenance headache. It's much easier to simply get the function name and execute it as a function ref.

      The second problem is more practical. The argument lists of the functions that get called are all different. Some take no args, some take one scalar, others take multiple array refs, etc. How could I set up a generic JSON message to convey such a varying list of parameters?

        How would you set up the parameters in Perl?

        Just set up the parameters, and then print them as encoded JSON. On the other side, reverse the process.

        For example, the following structure should work fine for passing parameters.

        { "parameters": [], "function": "frobnitz" }

        Also, maybe JSON::RPC is of help.

Re: Executing functions from another process
by jethro (Monsignor) on Jan 23, 2014 at 20:13 UTC