
xaprb has asked for the wisdom of the Perl Monks concerning the following question:

I'm working on making innotop (a MySQL monitor) capable of monitoring multiple MySQL servers simultaneously. One of the bottlenecks is waiting for query results to come back from the servers: it's fine when there's just one server, but if the program is to refresh itself every second, the serial waits add up quickly as servers are added. I'd like to fire the queries all at once. There are other tasks (parsers) I'd like to parallelize too, so the parsers can work on the first query's results while the others are still being generated, but that's really optional; if I can just get the queries to be asynchronous, I'll be happy enough with that.
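To put numbers on it: with serial queries the refresh time is the sum of every server's round-trip, so even modest latencies blow the one-second budget. A toy illustration (the ten servers and 200 ms per query are made-up figures, simulated with a sleep):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Time::HiRes qw(time sleep);

    # Stand-in for one server's query round-trip; the real code
    # would call DBI against each server in turn.
    sub fake_query { sleep 0.2; return 'rows' }

    my @servers = (1 .. 10);    # hypothetical: ten monitored servers

    my $start = time;
    fake_query($_) for @servers;
    printf "serial refresh: %.1f s\n", time - $start;    # ~2 s, over a 1 s budget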

My first thought was forking child processes, but (correct me if I'm wrong, as I'm new to this) a forked child gets its own copy of the parent's data, so nothing it changes is visible back in the parent process.
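To illustrate what I mean, here's a minimal demonstration (plain fork, no modules; the hash and key names are made up):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %results;

    my $pid = fork();
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {
        # Child: this writes into the child's own copy of %results.
        $results{server1} = 'up';
        exit 0;
    }

    waitpid($pid, 0);
    # The parent's %results is untouched, so this prints 0.
    print scalar(keys %results), "\n";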

Then I looked around the web. Other solutions came to mind: shared memory; opening a pipe from the child to the parent, so the child can serialize the result and the parent can re-hydrate it; using Storable to have the child write to a file and the parent read it back; and so on. These all strike me as horrific kludges that may not be safe or portable. Perl's threads also look non-portable, according to what I've read online.
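In case it helps to see what I mean by the pipe idea, here's roughly the shape of it. pipe, fork, and Storable are all core; run_query and the server names are made-up stand-ins for the real DBI calls:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Storable qw(freeze thaw);

    # Hypothetical stand-in for a real DBI query against one server.
    sub run_query {
        my ($server) = @_;
        return { server => $server, status => 'ok' };
    }

    my @servers = qw(db1 db2 db3);    # hypothetical server names
    my (%pipe_for, @pids);

    for my $server (@servers) {
        pipe(my $reader, my $writer) or die "pipe: $!";
        binmode $_ for $reader, $writer;    # Storable's output is binary

        my $pid = fork();
        die "fork: $!" unless defined $pid;

        if ($pid == 0) {
            # Child: query one server, freeze the result, send it up the pipe.
            close $reader;
            print {$writer} freeze(run_query($server));
            close $writer;
            exit 0;
        }

        # Parent: keep the read end and remember the child.
        close $writer;
        $pipe_for{$server} = $reader;
        push @pids, $pid;
    }

    # The children run concurrently; collect and re-hydrate each result.
    for my $server (@servers) {
        my $fh     = $pipe_for{$server};
        my $frozen = do { local $/; <$fh> };
        close $fh;
        my $result = thaw($frozen);
        print "$server: $result->{status}\n";
    }

    waitpid($_, 0) for @pids;    # reap the children

One caveat: the collection loop blocks on each pipe in turn, so the total wait is roughly the slowest server rather than the sum; a fancier version would select() on the read ends.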

There are some CPAN modules that do things like this, but I'm picky: I don't want people to have to install a bunch of arcane modules to run this program (one or two is okay, but not an entire bundle). And it needs to be fast, stable and portable to at least Linux, FreeBSD and Windows. Stop me if I'm asking too much.

I'm willing to go down any of the above-mentioned roads, but I'd love to hear your thoughts on which will be the most fruitful. I'd hate to spend time on something that's not going to work out well in the end.

Your guidance is gratefully received -- thanks for reading this far!