PerlMonks
Efficient shared memory - possible? how?
by cnd (Acolyte) on Feb 20, 2012 at 10:47 UTC ( [id://955034] )
cnd has asked for the wisdom of the Perl Monks concerning the following question:

I want to run a dozen simultaneous fork()'d perl scripts (each with its own individual processor affinity on a multi-CPU host). I want all of them to have efficient access to a large pool of mostly-static shared memory. For example, I want every script to be able to do this:

    print $shared{'hugedata'};

and this:

    $shared{'totalrequests'}++;

but with only one copy of all that data living in memory. I specifically do not want to shuffle copies of stuff around, or to serialize/unserialize everything all the time.

Is this possible? If not, how hard do you think it would be to extract the variable-handling functions out of the perl source and compile them into some kind of .xs loadable module?
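One common approach, sketched below under assumptions not stated in the question: load the large mostly-static data *before* forking (the kernel then shares those pages copy-on-write at no extra cost), and keep only the small mutable part, such as a request counter, in a SysV shared memory segment. shmget/shmread/shmwrite are Perl builtins and the constants come from the core IPC::SysV module; the worker count and segment layout here are illustrative, not from the original post.

```perl
#!/usr/bin/perl
# Sketch: a counter shared across fork()'d workers via SysV shared memory.
# Big read-mostly data should simply be built before fork() so it is
# shared copy-on-write; only the mutable counter needs a real segment.
use strict;
use warnings;
use IPC::SysV qw(IPC_PRIVATE IPC_CREAT IPC_RMID);

my $size  = 8;   # one 64-bit counter
my $shmid = shmget(IPC_PRIVATE, $size, IPC_CREAT | 0600)
    // die "shmget failed: $!";
shmwrite($shmid, pack('q', 0), 0, $size) or die "shmwrite: $!";

# NOTE: a production version must wrap the read-modify-write below in a
# semaphore (e.g. IPC::Semaphore); this sketch omits locking for brevity,
# so concurrent increments can race.
my $workers = 4;
for (1 .. $workers) {
    my $pid = fork() // die "fork: $!";
    next if $pid;                                  # parent keeps forking
    shmread($shmid, my $buf, 0, $size) or die "shmread: $!";
    shmwrite($shmid, pack('q', unpack('q', $buf) + 1), 0, $size)
        or die "shmwrite: $!";
    exit 0;
}
wait() for 1 .. $workers;

shmread($shmid, my $buf, 0, $size) or die "shmread: $!";
print "total: ", unpack('q', $buf), "\n";          # racy: may be < 4
shmctl($shmid, IPC_RMID, 0);                       # release the segment
```

This avoids serializing the large hash entirely; only the few bytes that actually change cross process boundaries. Modules such as IPC::Shareable can tie a whole hash to shared memory, but they serialize values on each access, which is exactly what the question wants to avoid.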
Back to Seekers of Perl Wisdom