On a new server that I'm installing, Apache often gets into some infinite loop, slurping all resources. I wanted to kill the Apache children with Apache::Resource, but that module does nothing at all on this system. So until I find a real solution for this problem, I quickly created this script to kill Apache children that use more than 16 MB of memory.
#!/usr/bin/perl -wl
use strict;
use File::Find::Rule qw(find);
sub readfile {
    open my $fh, '<', shift or return '';
    local $/;  # slurp mode: read the whole file at once
    return readline $fh;
}
my @files = find 'file', name => 'status', in => '/proc';
# 33 is the Apache user's uid
@files = grep readfile($_) =~ /^Uid:\s*33\b/m, @files;
@files = grep readfile($_) =~ /^VmData:\s*(\d+)/m && $1 > 16384, @files;
/(\d+)/ and print("Killing $1") and kill 15, $1 for @files;
And in root's crontab:
* * * * * perl /root/apachekill.pl 2>/dev/null
And no, this program is not efficient at all. I should cache the readfile()s and avoid the temporary arrays. There is probably some F::F::R extension that looks into files and could eliminate my greps and readfiles completely. It doesn't matter much; I'll have to find a real solution anyway :)
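For what it's worth, one way to avoid the double readfile() and the temporary arrays is to read each status file once and do both checks in a single pass. This is only a sketch of that idea; should_kill() is a hypothetical helper, and it assumes the same things the script above does (uid 33 is Apache, VmData is reported in kB):

```perl
use strict;
use warnings;

# Hypothetical helper: decide from the text of one /proc/<pid>/status
# file whether the process runs as uid 33 (Apache here) and has a data
# segment over 16 MB. VmData in /proc is reported in kB.
sub should_kill {
    my ($status) = @_;
    return 0 unless $status =~ /^Uid:\s*33\b/m;
    return 0 unless $status =~ /^VmData:\s*(\d+)/m;
    return $1 > 16384;  # more than 16384 kB = 16 MB
}

# Made-up sample of a /proc/<pid>/status fragment for a fat child:
my $sample = "Name:\tapache\nUid:\t33\t33\t33\t33\nVmData:\t20480 kB\n";
print should_kill($sample) ? "would kill\n" : "would spare\n";
```

The kill loop would then iterate once over the find() results, call should_kill() on each slurped file, and extract the pid from the path, instead of grepping the list twice.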
- Yes, I reinvent wheels.