http://qs321.pair.com?node_id=11135570

nysus has asked for the wisdom of the Perl Monks concerning the following question:

I've got a data file that will be read from and written to by several processes running at the same time. I want to be sure the data in the file is opened, manipulated, and saved by one process at a time, so the processes don't stomp on each other's work. From what I've read, using a semaphore file is the way to go. I am using Storable to read/write the data. I came up with these two helper functions wrapping store and retrieve, which I'm hoping will do the trick:

use Storable;
use Fcntl qw(:flock);

sub _store {
    my $data = shift;
    my $data_file = 'data/plugin_portfolio';
    store $data, $data_file;
    close LOCK;
}

sub _retrieve {
    my $data_file = 'data/plugin_portfolio';
    return 0 if !-f $data_file;
    my $lock = $data_file . '.lck';
    open (LOCK, "> $lock") or die "Can't open lock file";
    flock(LOCK, LOCK_EX);
    retrieve $data_file;
}

I believe bareword filehandles like LOCK are global, so I don't think it's a problem that the handle is opened in one subroutine and closed in another. But I'm worried there might be something I'm missing that will cause me to lose data. Or maybe there's a simpler way...
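For what it's worth, one way to avoid splitting the lock across two subroutines is to hold it for the whole read-modify-write cycle in a single place. The sketch below assumes the same `data/plugin_portfolio` path and `.lck` semaphore file as above; `with_lock` and `update_data` are hypothetical helpers, not part of Storable or Fcntl, and a lexical filehandle is used so the lock is scoped rather than global:

```perl
use strict;
use warnings;
use Fcntl qw(:flock);
use Storable qw(store retrieve);

my $data_file = 'data/plugin_portfolio';   # path from the post above
my $lock_file = "$data_file.lck";

# Run a code ref while holding an exclusive lock on the semaphore file.
# The lexical $lock_fh keeps the handle out of the global namespace.
sub with_lock {
    my ($code) = @_;
    open my $lock_fh, '>', $lock_file
        or die "Can't open lock file '$lock_file': $!";
    flock($lock_fh, LOCK_EX)
        or die "Can't lock '$lock_file': $!";
    my $result = $code->();
    close $lock_fh;   # closing the handle releases the lock
    return $result;
}

# The retrieve, the modification, and the store all happen under one
# lock, so no other process can slip in between the read and the write.
sub update_data {
    my ($modify) = @_;
    return with_lock(sub {
        my $data = -f $data_file ? retrieve($data_file) : {};
        $modify->($data);
        store($data, $data_file);
        return $data;
    });
}
```

Storable also exports lock_store and lock_retrieve, which flock the data file itself, but each call takes and releases its own lock, so a read-modify-write sequence would still need an outer lock like the one above.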

$PM = "Perl Monk's";
$MCF = "Most Clueless Friar Abbot Bishop Pontiff Deacon Curate Priest Vicar";
$nysus = $PM . ' ' . $MCF;