in reply to Preventing multiple instances

Here's some sample code, using Perl's flock function, that I've used for many years on both Unix and Windows to ensure only one copy of a script is running at a time (update: this works on local file systems only, not on NFS).

```perl
use strict;
use warnings;
use Fcntl ':flock';    # import LOCK_* constants

warn "program starts\n";
my $somelockfile = 'choosealockfilename';    # adjust according to your needs
open(my $fhlock, '>', $somelockfile) or die "Error: open '$somelockfile': $!";
warn "before flock: filename='$somelockfile'\n";

# Note: process will block at this point until the lock is acquired.
flock($fhlock, LOCK_EX) or die "Error: flock '$somelockfile': $!";

# Lock is now held until $fhlock is closed.
# Note that even if this program crashes or is killed, $fhlock will
# be closed by the OS and the lock released.
# ...
warn "Got lock, sleeping for 10 secs...\n";
sleep 10;
warn "woke up\n";

# Release the lock simply by closing the file handle.
close $fhlock or die "Error: close '$somelockfile': $!";
warn "lock released: sleeping for 5 secs...\n";
sleep 5;
warn "program ends\n";
```
You can easily test its behaviour by running the little test program above in two different terminal sessions (and either waiting for the sleep to end or manually killing one of the processes).
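The program above blocks until the lock becomes available, which is what you want for a queued "wait your turn" scheme. If instead you want the second instance to give up immediately (the more common "prevent multiple instances" behaviour), combine LOCK_EX with LOCK_NB so flock returns false right away rather than blocking. A minimal sketch (the lock file name here is just a placeholder):

```perl
use strict;
use warnings;
use Fcntl ':flock';    # import LOCK_* constants

my $lockfile = 'choosealockfilename';    # placeholder name, adjust to taste
open(my $fhlock, '>', $lockfile) or die "Error: open '$lockfile': $!";

# LOCK_NB makes flock non-blocking: it returns false immediately
# if another process already holds the lock.
unless (flock($fhlock, LOCK_EX | LOCK_NB)) {
    warn "Another instance is already running, exiting\n";
    exit 1;
}

# ... do the real work here while holding the lock ...

# As before, the lock is released when $fhlock is closed,
# whether explicitly or because the process exits.
close $fhlock or die "Error: close '$lockfile': $!";
```

As with the blocking version, no explicit unlock is needed: the OS releases the lock when the file handle is closed, even if the process crashes.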

Apart from providing portable locking across Unix and Windows, flock has long been a favourite of Perl poets, as beautifully shown by pjf in this immortal line:

join flock($other,@chickens) if split /from others/;
from his classic poem my @chickens (by the way, pjf is one of the few monks I've met in real life; in addition to running chickens in his backyard, he has a keen interest in picking and eating unusual and delicious wild plants you won't find in any supermarket ... not for the faint-hearted, you need a keen eye and expert knowledge to avoid being poisoned).

References Added Later