File locking

by nysus (Parson)
on May 13, 2001 at 10:23 UTC ( [id://80004] )

nysus has asked for the wisdom of the Perl Monks concerning the following question:

Looking for some basic info on file locking and when to use it. From what I understand, file locking prevents a file from being accessed by two or more processes at the same time.

I'm a bit ignorant on processes so I need some clarification. Let's say I've got a .cgi file on my server that asks for a password and checks it against the password in a text file. The password file is shared by a couple of different scripts.

So is file locking needed in this situation? Or is it impossible for the password file to be accessed simultaneously because only one .cgi script can be run at a time? Or am I totally f***** up on all this?

$PM = "Perl Monk's";
$MCF = "Most Clueless Friar";
$nysus = $PM . $MCF;

Replies are listed 'Best First'.
Re: File locking
by sierrathedog04 (Hermit) on May 13, 2001 at 15:17 UTC
    One point to remember is that Perl only implements advisory file locking. That means that if you use flock but some other Perl script doesn't, then that other Perl script will be able to access the file on which you used flock.

    The Camel book makes this distinction at length. Flock is like a red light at a traffic intersection. If you and another driver both obey red lights, then only one of you can be in the middle of the intersection at a time. But if the other driver doesn't obey red lights, then he can try to use the intersection at the same time as you, and a collision results.
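
    To see what "advisory" means in practice, here is a minimal sketch of my own (not from the Camel): the parent takes an exclusive lock, yet a child that never calls flock can still write to the file, because nothing forces it to ask for the lock first.

        #!/usr/bin/perl -w
        # A lock only stops processes that also call flock.
        use strict;
        use Fcntl qw(:flock);

        my $file = 'advisory.test';

        open(my $fh, '>', $file) or die "Can't create $file: $!";
        flock($fh, LOCK_EX)      or die "Can't lock $file: $!";

        defined(my $pid = fork) or die "Can't fork: $!";
        if ($pid) {
            # Parent: hold the lock while the child runs, then reap it.
            sleep 2;
            waitpid($pid, 0);
        } else {
            # Child: never calls flock, so the parent's lock doesn't stop it.
            open(my $rude, '>>', $file) or die "Child can't open $file: $!";
            print {$rude} "written while the parent held LOCK_EX\n";
            close $rude;
            exit;
        }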

Re: File locking
by physi (Friar) on May 13, 2001 at 11:30 UTC
    A short word on file locking

    File locking works only for writing files. If you want to implement it for reading, you have to open the file as read/write. Then all other processes that also open the file in read/write mode will wait until the first process unlocks the file.
    For just reading the file, locking is not necessary and wouldn't even work.

    ----------------------------------- --the good, the bad and the physi-- -----------------------------------
      I don't think this is correct in general. Here's an example that demonstrates locking a file that has been opened for reading:
          #!/usr/local/bin/perl -w
          use strict;
          use Fcntl qw/:flock/;

          my $file = 'flock.test';
          -e $file or open(FILE, ">$file") or die "Can't create $file: $!";

          if (fork) {
              open(FILE, "<$file") or die "Can't open $file for reading in parent: $!";
              flock(FILE, LOCK_EX) or die "Can't flock $file in parent: $!";
              wait;
          } else {
              sleep 1;
              open(FILE, "<$file") or die "Can't open $file for reading in child: $!";
              flock(FILE, LOCK_EX|LOCK_NB) or die "Can't flock $file in child: $!";
          }
      And here's the output it produces when run on my system:

          Can't flock flock.test in child: Resource temporarily unavailable at flock.pl line 18.

      However, this behavior is probably system dependent, so physi and I are likely both right.
      For just reading the file, locking is not necessary and wouldn't even work.

      This is only true if the file is never going to be modified programmatically. If it is, then it needs to be flock'ed in shared mode (LOCK_SH) unless you don't care about the fact that the file might change while you're trying to read it. Usually you do care, 'cos if you don't then I don't see why you are bothering to lock it at all. ;)
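
      To make that concrete for the original password-file question, here is a minimal sketch of the reading side, assuming (my assumption, not nysus's) a plain-text file of user:password lines named passwd.txt. The shared lock just guarantees that nobody rewrites the file halfway through the read:

          use strict;
          use Fcntl qw(:flock);

          # Hypothetical layout: one "user:password" pair per line.
          my $pwfile = 'passwd.txt';

          sub check_password {
              my ($user, $pass) = @_;
              open(my $fh, '<', $pwfile) or die "Can't open $pwfile: $!";
              flock($fh, LOCK_SH)        or die "Can't get shared lock on $pwfile: $!";
              while (my $line = <$fh>) {
                  chomp $line;
                  my ($u, $p) = split /:/, $line, 2;
                  return 1 if defined $p && $u eq $user && $p eq $pass;
              }
              return 0;   # $fh closes here, which releases the lock
          }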

Re: File locking
by mirod (Canon) on May 13, 2001 at 11:47 UTC

    Looking for info on file locking? Why not try the Search field right at the top of the page? It would have given you a tutorial on File locking, which is not perfect, but KM's follow-ups are pretty neat. I like (and use) the one he describes in RE: File Locking.

    Using "locking" on the Super Search would also have retrieved a bunch of interesting nodes, for example How do I lock a file?

      My question was more about whether a new process gets spawned every time a .cgi request is made and how this would affect the need to lock files. I don't know much about how forking and processes work.

      $PM = "Perl Monk's";
      $MCF = "Most Clueless Friar";
      $nysus = $PM . $MCF;

        My question was more about whether a new process gets spawned every time a .cgi request is made and how this would affect the need to lock files.

        Yes, a new process is spawned every time a CGI request is made. There is no direct connection between this fact and the need, if any, to lock files. File locking is a design decision that you make in order to protect file data when there is a strong possibility that more than one process might update a file simultaneously. Whether the process happens to be invoked via CGI doesn't really enter into it.
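
        For example, if one of the scripts sharing the password file can rewrite it, that writer is where locking earns its keep. Here is a minimal sketch of the updating side (file name and user:password format are my own assumptions, matching the reader sketch above); the exclusive lock covers the whole read-modify-write so two processes can't interleave their updates:

            use strict;
            use Fcntl qw(:flock);

            # Hypothetical updater: change one user's password in place.
            sub set_password {
                my ($user, $newpass) = @_;
                my $pwfile = 'passwd.txt';

                # '+<' opens for read and write without truncating, so we
                # can take the lock before touching the contents.
                open(my $fh, '+<', $pwfile) or die "Can't open $pwfile: $!";
                flock($fh, LOCK_EX)         or die "Can't get exclusive lock: $!";

                my @lines = <$fh>;
                for (@lines) {
                    my ($u) = split /:/;
                    $_ = "$user:$newpass\n" if defined $u && $u eq $user;
                }

                seek($fh, 0, 0)  or die "Can't rewind $pwfile: $!";
                truncate($fh, 0) or die "Can't truncate $pwfile: $!";
                print {$fh} @lines;
                close($fh)       or die "Can't close $pwfile: $!";   # lock released here
            }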

Re: File locking
by Eureka_sg (Monk) on May 13, 2001 at 10:38 UTC

    Hi, a separate process is created each time a .cgi script is requested by the client browser. (If you have mod_perl installed, then I'm not sure.)
    So it is possible for the password file to be accessed simultaneously. However, if all you do is read from the file and verify your password, then it's safe. If a process might be modifying your passwords at the same time, then you should make sure the file is properly locked.
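
    One common way to "ensure proper locking" when several scripts share one data file is to have them all lock a separate, empty semaphore file rather than the data file itself; whoever holds the semaphore may touch the data. A minimal sketch, with file names of my own choosing:

        use strict;
        use Fcntl qw(:flock);

        # Every script that touches passwd.txt agrees to hold passwd.lock first,
        # whether it intends to read or to write (both names are hypothetical).
        my $lockfile = 'passwd.lock';

        open(my $lock, '>', $lockfile) or die "Can't open $lockfile: $!";
        flock($lock, LOCK_EX)          or die "Can't lock $lockfile: $!";

        # ... read or rewrite passwd.txt here ...

        close($lock);   # releases the lock so the next process can proceed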

    UPDATE: Kindly refrain from ++ my nodes, else Anonymous Monk is going to think that nysus is me or I'm out to cheat xp. : P
