PerlMonks
Re: Central logging methods and thoughts
by Perl Mouse (Chaplain) on Oct 10, 2005 at 09:52 UTC ( [id://498758] )
Large companies tend to do their logging/monitoring centrally. They have thousands of devices (computers, disk arrays, switches, routers, tape robots, etc.) and only a handful of staff to monitor them, so centralization is a necessity. There are commercial products for this, such as HP OpenView and Tivoli.
Central logging has another benefit: central logging implies remote logging. Remote logging means that if a machine goes haywire (or, in a hostile environment, gets compromised), the logs are less likely to get wiped out. In many places I've worked, be it as an employee or a contractor, some sort of central logging was done — from something as simple as FTP'ing local logs in a nightly batch job, to thousands of machines monitored and logged with Tivoli, displaying the current status of the environment on a monitor 3 metres wide. Personally, I'd go for central logs, if only because that means I know where to go digging for possible log file entries. But then, I look at the problem more from a sysadmin's angle than a programmer's or an end-user's.
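As a minimal sketch of the remote-logging idea in Perl, you can point the core Sys::Syslog module at a central syslog host over UDP instead of the local socket. The host name and program name here are placeholders, and your central server must be configured to accept remote syslog traffic:

<code>
use strict;
use warnings;
use Sys::Syslog qw(:standard :macros);
use Sys::Hostname;

# Placeholder: replace with your actual central log server.
my $loghost = 'loghost.example.com';

# Send syslog messages over UDP to the central host rather than /dev/log.
setlogsock({ type => 'udp', host => $loghost });

openlog('myapp', 'pid', LOG_LOCAL0);    # 'myapp' tags the entries
syslog(LOG_INFO, 'nightly job finished on %s', hostname());
closelog();
</code>

Because the message leaves the machine the moment it is logged, an attacker who later wipes the local disk can't retroactively erase it from the central server — which is exactly the property the nightly-FTP approach only gives you up to the last batch run.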
Perl --((8:>*
In Section: Meditations