Multiple processes, one log file?

by splinky (Hermit)
on May 19, 2005 at 19:08 UTC ( [id://458741] )

splinky has asked for the wisdom of the Perl Monks concerning the following question:

My fellow monks...

My system comprises multiple Perl processes under Linux. If it came down to it, I could have each process log to its own file and then merge them together after the fact, but I would really like all these processes to log to the same file, with all the log lines interleaved and ordered by time.

I've tried a few simple experiments with Log::Log4perl using Log::Log4perl::Appender::File, and while it appears to work, I haven't seen anything in the documentation to assure me that it's meant to work.
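
For reference, here's roughly the kind of setup I've been experimenting with (the log path and layout below are just placeholders):

    use Log::Log4perl;

    # Every process runs the same init and appends to the same file.
    my $conf = q(
        log4perl.rootLogger                = DEBUG, Logfile
        log4perl.appender.Logfile          = Log::Log4perl::Appender::File
        log4perl.appender.Logfile.filename = /tmp/combined.log
        log4perl.appender.Logfile.mode     = append
        log4perl.appender.Logfile.layout   = Log::Log4perl::Layout::PatternLayout
        log4perl.appender.Logfile.layout.ConversionPattern = %d %P %p %m%n
    );
    Log::Log4perl->init(\$conf);

    Log::Log4perl->get_logger()->debug("process started");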

Is there a simple solution?

Thanks.

Replies are listed 'Best First'.
Re: Multiple processes, one log file?
by sgifford (Prior) on May 20, 2005 at 01:30 UTC

    I implemented something similar (though not using log4perl) using flock. You just get an exclusive lock on the logfile before writing, then write, then release the lock.

    Before that, my experience has been that if you write less than the system block size or MAX_PIPE (4K on many systems), you can get away without locking. I'm not sure if that's guaranteed; I remember reading it once, but I've been looking for the reference for two years without finding it, so I can't say for sure. :)

      In fact, it is not guaranteed. We reinvented our own logging wheel; about 95% of our logging activity goes to a single shared log file, and we find it very useful.

      From the Perl docs, various web examples, this web column by merlyn (written back when I didn't even know what Perl was), and extensive™ testing, I learnt that using flock() with the LOCK_EX flag is the only way to get consistent updates/writes from many processes onto a single log file (or page counters, for that matter), though this method is not reliable across all platforms.

      Not using flock resulted in mangled log files, with different processes' log buffers being mixed and truncated at random.
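
      As a minimal sketch of that locking pattern (the log file name and line format here are only illustrative):

          use Fcntl qw(:flock SEEK_END);

          sub log_line {
              my ($msg) = @_;
              open my $fh, '>>', '/var/log/myapp.log' or die "open: $!";
              flock($fh, LOCK_EX)     or die "flock: $!";
              seek($fh, 0, SEEK_END); # move to EOF in case another writer appended first
              print {$fh} scalar(localtime) . " [$$] $msg\n";
              flock($fh, LOCK_UN);
              close $fh or die "close: $!";
          }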

Re: Multiple processes, one log file?
by astroboy (Chaplain) on May 20, 2005 at 02:27 UTC
    Do your processes share the same parent, or are you in a persistent environment like mod_perl? The issues are covered in the Log4perl FAQ. Anyway, have a look at Log::Log4perl::Appender::Synchronized.
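
    A rough sketch of what that wiring might look like, with the Synchronized appender wrapping a plain File appender (log path and layout are just placeholders):

        use Log::Log4perl;

        # The Syncer appender serializes writes from all processes
        # before handing them to the underlying Logfile appender.
        my $conf = q(
            log4perl.rootLogger                = INFO, Syncer
            log4perl.appender.Logfile          = Log::Log4perl::Appender::File
            log4perl.appender.Logfile.filename = /tmp/shared.log
            log4perl.appender.Logfile.mode     = append
            log4perl.appender.Logfile.layout   = Log::Log4perl::Layout::SimpleLayout
            log4perl.appender.Syncer           = Log::Log4perl::Appender::Synchronized
            log4perl.appender.Syncer.appender  = Logfile
        );
        Log::Log4perl->init(\$conf);

        Log::Log4perl->get_logger()->info("hello from PID $$");
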
Re: Multiple processes, one log file?
by Vautrin (Hermit) on May 19, 2005 at 19:42 UTC
    Just as a thought, what about creating a daemon that would run in the background and listen on a socket? It would only need one thread and it could just log as information came into the hopper...
      You could call it "syslog". :-)
        It's not fun if you don't roll it yourself. :-D
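
      For what it's worth, handing the serialization job to the existing syslog daemon only takes a few lines; a minimal sketch (the ident and facility below are placeholders):

          use Sys::Syslog qw(:standard :macros);

          # Each process opens its own connection; syslogd serializes the writes.
          openlog('myapp', 'ndelay,pid', LOG_LOCAL0);
          syslog(LOG_INFO, 'worker %d started', $$);
          closelog();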
