[Solved] Easiest way to protect process from duplication.

by kazak (Beadle)
on Jan 20, 2012 at 17:03 UTC

kazak has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks, is there any way to make a script check whether a copy of itself is already running, and exit if a previous copy is still active? I know about Proc::PID, but maybe there is an easier way, something as simple as a brick that can be included in the script itself. Something like:

Script is triggered -> script checks whether a copy of itself is already running -> script starts if no copy is found, or dies if one is still running.


Replies are listed 'Best First'.
Re: Easiest way to protect process from duplication.
by cguevara (Vicar) on Jan 20, 2012 at 18:07 UTC
      Great tip! Proc::PID::File has the slight disadvantage that it does not ship with the standard Perl distribution -- at least not anywhere I looked -- so it requires managing CPAN installs.
Re: Easiest way to protect process from duplication.
by ww (Archbishop) on Jan 20, 2012 at 17:42 UTC
    So why not code a test right into your script (maybe in a BEGIN{...} block) using Proc::PID::File?

        After all, that's the very first example in the Synopsis!

    -- for posing a question without -- it appears -- even minimal effort on your part.

Re: Easiest way to protect process from duplication.
by JavaFan (Canon) on Jan 20, 2012 at 20:23 UTC
    use Fcntl ':flock';
    open my $fh, "+<", $0 or exit;
    exit unless flock $fh, LOCK_EX | LOCK_NB;
    Note that depending on your OS and how you call the program, a second open may fail (and then, the flock isn't necessary).
      Here's a variation that uses the __DATA__ handle.
      use Fcntl qw(LOCK_EX LOCK_NB);
      die "Another instance is already running" unless flock DATA, LOCK_EX | LOCK_NB;
        That actually requires a __DATA__ (or __END__) token to be present.
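        For completeness, a minimal self-contained sketch putting those pieces together (the sleep is just a stand-in for real work; this relies on flock working on a read-only handle, which holds where Perl uses the flock(2) system call):

            #!/usr/bin/perl
            use strict;
            use warnings;
            use Fcntl qw(LOCK_EX LOCK_NB);

            # The DATA handle exists only because of the __DATA__ token below;
            # a second copy of this script fails the non-blocking flock and dies.
            die "Another instance is already running\n"
                unless flock DATA, LOCK_EX | LOCK_NB;

            sleep 60;    # stand-in for the real work

            __DATA__
            This section exists only so the DATA handle can be locked.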
        Thank you for your post; that is exactly what I need.
      Thanks for your attention, everyone. The program is supposed to run in the background (I mean # test.pl &), so does that change anything?
        No, why should it?
Re: Easiest way to protect process from duplication.
by mbethke (Hermit) on Jan 20, 2012 at 17:51 UTC
    You mean Proc::PID::File? Have a look at File::Lockfile as well, but it's not that much easier.
    use Proc::PID::File;
    die "Already running!" if Proc::PID::File->running();
    Doesn't get much easier than that, does it? Edit: oops, ww beat me to it :)
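    If the defaults don't suit, running() also takes options; a sketch (the dir and name values below are made-up examples; by default the module keeps the pidfile under /var/run, named after $0):

        use Proc::PID::File;
        die "Already running!\n" if Proc::PID::File->running(
            dir  => "/tmp",         # pidfile directory (assumed)
            name => "myscript",     # pidfile base name (hypothetical)
        );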
Re: Easiest way to protect process from duplication.
by thospel (Hermit) on Jan 20, 2012 at 20:21 UTC
    For this sort of problem my preferred solution is simply to take an exclusive non-blocking lock on a file with flock. Then the operating system does the checking for me. For nice error messages I usually let the process that takes the lock write its pid into the file (or into another file if I need it to work on Windows), so that a program that fails to take the lock can read the pid from the lockfile and mention it in a warning message.

    Don't delete the lockfile when you are done, by the way. That can lead to subtle races.
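    A sketch of that approach, assuming a lockfile path of /tmp/myscript.pid (any writable path works; /var/run is traditional for daemons), and deliberately never unlinking it:

        use strict;
        use warnings;
        use Fcntl qw(:flock O_RDWR O_CREAT);
        use IO::Handle;

        my $lockfile = '/tmp/myscript.pid';    # assumed path

        # Open with create (O_CREAT) but without exclusive (O_EXCL).
        sysopen(my $fh, $lockfile, O_RDWR | O_CREAT)
            or die "Cannot open $lockfile: $!\n";

        unless (flock($fh, LOCK_EX | LOCK_NB)) {
            # Somebody else holds the lock; read their pid for a nicer message.
            my $pid = <$fh>;
            defined $pid or $pid = '';
            chomp $pid;
            die "Already running" . ($pid ne '' ? " (pid $pid)" : '') . "\n";
        }

        # We hold the lock: record our pid so a later loser can report it.
        seek($fh, 0, 0);
        truncate($fh, 0);
        $fh->autoflush(1);
        print {$fh} "$$\n";

        # ... real work here; the lock is released when the process exits,
        # and the lockfile is intentionally left in place.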

      Oh? Interesting. Can you edify us as to what that subtle race condition is?

        Suppose the sequence in the program is:
        open
        lock
        unlink
        exit
        The unlink comes before any unlock or close; otherwise you get even more race scenarios (on Windows you must actually close the file before being able to delete it). The open is an open with create (O_CREAT), otherwise unlinking would make the next program invocation fail, but without exclusive (O_EXCL), because with O_EXCL we would be getting into a different locking scheme (with even more problems). This type of open is what you get if you do a plain open($fh, ">", $file) in Perl.

        Now you can get this sequence:

        process A: open (and create)
        process A: lock
        process B: open (same file, so no create)
        process A: unlink
        process A: exit (implicit unlock)
        process B: lock (on the file A just deleted, since B still has an open handle on it)
        process C: open (and create a new file with the old path name)
        process C: lock (on the new file)
        Now processes B and C are running simultaneously, with locks on different files, only one of which is visible in the filesystem.
Re: Easiest way to protect process from duplication.
by scorpio17 (Canon) on Jan 20, 2012 at 20:18 UTC
