PerlMonks  

File Locking in CGI programs

by enemyofthestate (Monk)
on Jul 06, 2005 at 00:25 UTC ( [id://472643]=perlquestion )

enemyofthestate has asked for the wisdom of the Perl Monks concerning the following question:

I have a logging problem that is driving me buggy. I have a CGI program that receives XML from a customer, processes it, enters some of the information into an Oracle database, and returns an XML document to the customer. Each step along the way is logged to a buffer and written all at once as the program ends. Here is how I write to the log:

    # open for append
    open FILE, ">>$g_db_logfile";
    # wait for lock
    flock (FILE, LOCK_EX);
    # once lock is granted seek to end, JIC
    seek (FILE, 0, 2);
    # print the string
    print FILE "$logstr\n";
    # unlock file
    flock (FILE, LOCK_UN);
    # close file
    close FILE;

The problem is that when several processes are running the log may not be written for all of them. I know the orders are coming in -- I can see them in the DB -- but I don't always get the expected log entry. Any ideas what I am doing wrong?

This particular program is running on Perl 5.6.0.

Replies are listed 'Best First'.
Re: File Locking in CGI programs
by tmoertel (Chaplain) on Jul 06, 2005 at 04:19 UTC
    If your software runs on Linux, you do not need the locking code, nor the seeking code. Just open the file in append mode and commit each log entry as a single write. Unix file semantics guarantee that the write will append to the file atomically. (For more information see Re^3: Looking for a simple multiprocess-enabled logging module, in which I explain more and invoke the Single Unix Specification.)

    See also "All I want to do is append a small amount of text to the end of a file. Do I still have to use locking?" from perlfaq5.

    Cheers,
    Tom
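    A minimal sketch of that append-only approach (the path and log message are illustrative; this assumes a local POSIX filesystem, where O_APPEND writes land atomically at end-of-file):

    ```perl
    use strict;
    use warnings;
    use IO::Handle;   # for autoflush() on older perls

    # Open in append mode; no flock or seek needed as long as
    # each log entry goes out as a single write.
    open my $log, '>>', '/tmp/app.log' or die "open: $!";
    $log->autoflush(1);
    print {$log} "one complete log entry\n" or die "print: $!";
    close $log or die "close: $!";
    ```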

Re: File Locking in CGI programs
by perrin (Chancellor) on Jul 06, 2005 at 01:04 UTC
    For one thing, you aren't checking for failures on commands like open(). Those may not succeed. You also don't need to actually flock the file if you open for append and are on a sane POSIX system. The seek will be done for you, and the writes will be atomic.
      I do check for a failure on open(); I just left that out as not relevant to the question. The code is running on a Linux system (FC3), so is it sane enough that I don't need to do the file locking? The string I write out may have embedded carriage returns, if that makes a difference.
        You can check the open() man page on your system, but I believe it's safe if the data fits in one block, which typically means 4K or less on Linux.

        Whether you check return values is always relevant, IMHO. You should at least include it to let us know you thought about it and are doing it, so as not to send someone chasing a potential red herring in your code.

        See my earlier message in this thread. Were you checking the return value of close()? It's highly relevant.

Re: File Locking in CGI programs
by chas (Priest) on Jul 06, 2005 at 01:10 UTC
    You probably have done this, but just to be sure, have you imported the lock constants with
    use Fcntl ':flock';
    ? (I seem to remember using
    use Fcntl ':DEFAULT';
    as well but looking at the module docs now I'm not sure why that's necessary...)
    chas
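    For reference, a quick sketch of what the :flock tag actually provides (the :DEFAULT tag is only needed for sysopen() flags such as O_RDWR or O_CREAT, not for flock() itself):

    ```perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);   # exports LOCK_SH, LOCK_EX, LOCK_NB, LOCK_UN

    # Without this import, a bareword like LOCK_EX is not the
    # numeric flag flock() expects.
    printf "SH=%d EX=%d NB=%d UN=%d\n", LOCK_SH, LOCK_EX, LOCK_NB, LOCK_UN;
    ```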
Re: File Locking in CGI programs
by duff (Parson) on Jul 06, 2005 at 03:45 UTC

    Don't explicitly unlock the file before closing it. In earlier perls (and I believe 5.6.0 is early enough) buffers were not flushed before locking/unlocking which could cause some data loss problems. Newer perls will flush before either a lock or an unlock.

    Oh, and closing the file will automatically unlock it and flush it, but check the return value of close to be sure that it succeeded.
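    Putting both points together, a sketch of the locked version with no explicit unlock and every return value checked (the filename is illustrative):

    ```perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    open my $fh, '>>', '/tmp/app.log' or die "open: $!";
    flock($fh, LOCK_EX)               or die "flock: $!";
    print {$fh} "entry\n"             or die "print: $!";
    # No flock($fh, LOCK_UN) here: on older perls (5.6.0 included)
    # that could release the lock before the buffer was flushed.
    # close() flushes first, then drops the lock -- and its return
    # value tells you whether the final write actually succeeded.
    close $fh or die "close: $!";
    ```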

Re: File Locking in CGI programs
by Tanktalus (Canon) on Jul 06, 2005 at 02:58 UTC

    The very first time I used a database in a web app, it was to avoid all file locking problems. So, of course, with that background, this just screams to be a database problem.

    Rather than logging to a file, insert into a logging table in your database. You already have a connection opened to it, just use it.

    Note also that, for the purpose of logging, syslog is also a database. It solves all these same problems - you're communicating with a single server that interfaces to the file(s) transparently. (If that server is multi-process or multi-threaded, or even multi-machine, that can be solved by whoever writes the server, and you don't need to worry about it.)

    I actually worry about very few file locking scenarios now ;-)
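    The table-based approach might look like the sketch below. The table name app_log and its columns are assumptions, and DBD::SQLite stands in only so the example is self-contained; the same DBI calls apply unchanged to a DBD::Oracle handle:

    ```perl
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                           { RaiseError => 1, AutoCommit => 1 });
    $dbh->do('CREATE TABLE app_log (logged_at TEXT, message TEXT)');

    # Each request inserts its own entry; the database serializes
    # concurrent writers, so no file locking is involved.
    my $logstr = 'order 42 processed';   # illustrative message
    $dbh->do(q{INSERT INTO app_log (logged_at, message)
               VALUES (datetime('now'), ?)}, undef, $logstr);

    my ($count) = $dbh->selectrow_array('SELECT COUNT(*) FROM app_log');
    print "rows: $count\n";   # prints "rows: 1"
    ```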

      Syslog is usually considered to be slower and less reliable than just appending to a local file yourself. It can drop messages sometimes.
