in reply to Calling perl systems from other systems, e.g. R

Depending on the complexity of the interaction between R and Perl that you need, an R package can ship Perl scripts and invoke them (subject to the availability of Perl on the target machine, of course; gdata::read.xls and gdata::installXLSXsupport work this way), or R itself can act as a server accepting commands from clients (e.g. Rserve).

If you take the HTTP server route, please be careful about accepting arbitrary code to evaluate over the local socket. Far too many modern applications start HTTP servers on localhost, only for someone else to discover an unprotected endpoint that an evil website can trivially reach by firing off a few thousand requests of the form http://localhost:${port}/eval?system("pwn_the_machine"). Ideally, I would suggest UNIX domain sockets or Windows named pipes (both are easily restricted to the current user and cannot be accessed by rogue JavaScript in a browser), but getting R to speak those (especially the latter) can be hard.
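If you do go the domain-socket route, the Perl end can be restricted to the current user simply by creating the socket inside a private directory with a tight umask. A rough sketch of what I mean — the socket path, the one-word command whitelist, and the helper names make_private_socket/serve_one are all invented for illustration:

```perl
use strict;
use warnings;
use Socket qw(SOCK_STREAM);
use IO::Socket::UNIX;
use File::Temp qw(tempdir);

# Create a listening UNIX domain socket that only the current user can
# reach: tempdir() makes a mode-0700 directory, and umask 0077 makes
# the socket file itself mode 0700.
sub make_private_socket {
    my $dir  = tempdir(CLEANUP => 1);
    my $path = "$dir/bridge.sock";
    my $old  = umask 0077;
    my $server = IO::Socket::UNIX->new(
        Type   => SOCK_STREAM,
        Local  => $path,
        Listen => 5,
    ) or die "cannot listen on $path: $!";
    umask $old;
    return ($server, $path);
}

# Serve one request per connection. Dispatch a fixed whitelist of
# commands -- never eval() whatever arrives on the wire.
sub serve_one {
    my ($server) = @_;
    my $client = $server->accept or return;
    my $req = <$client> // '';
    chomp $req;
    my $reply = $req eq 'ping' ? "pong\n" : "unknown command\n";
    print $client $reply;
    close $client;
    return $req;
}
```

The point of the whitelist dispatch is exactly the localhost problem above: even if something unexpected does manage to reach the socket, the worst it can do is invoke one of the commands you chose to expose, not arbitrary code.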

Re^2: Calling perl systems from other systems, e.g. R
by swl (Parson) on May 01, 2022 at 22:42 UTC

    Many thanks for this.

    I need to look more deeply, but gdata seems to use system calls. My fallback is to run scripts via system calls, but ideally there would be persistence, so the Perl side of things can run more analyses without having to restart scripts and reload data.
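    By persistence I mean something like the sketch below: R starts one long-lived Perl process and writes one command per line to its stdin, so data stays loaded in memory between analyses. (The load/sum command set and the handle() helper are made up purely to show the shape.)

```perl
use strict;
use warnings;

# Dispatch one command against persistent state; returns the reply.
sub handle {
    my ($state, $line) = @_;
    my ($cmd, @args) = split /\s+/, $line;
    if ($cmd eq 'load') {                # load <name> <n>: stand-in loader
        $state->{ $args[0] } = [ 1 .. $args[1] ];
        return "loaded $args[0]";
    }
    if ($cmd eq 'sum') {                 # sum <name>: reuses loaded data
        my $total = 0;
        $total += $_ for @{ $state->{ $args[0] } || [] };
        return $total;
    }
    return "unknown: $cmd";
}

# Long-lived loop: R starts this script once (e.g. a pipe() connection,
# or the processx package) and writes one command per line to stdin.
my %data;                                # survives between commands
$| = 1;                                  # flush each reply immediately
while (my $line = <STDIN>) {
    chomp $line;
    last if $line eq 'quit';
    print handle(\%data, $line), "\n";
}
```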

    It also requires that the user have a working Perl. My system has a large dependency tree that includes many XS libs, so it can't be fatpacked. Compiling with Rtools and installing into a project-level local::lib is not impossible, but it does get complex and hard to control across user environments. ...Although now that I read more, I see it can compile such libs.

    Thanks also for the sockets/pipes advice. I knew about the risks of evaluating arbitrary code, but had never considered external websites hitting local services.