PerlMonks
RE: RE: RE: Distributed network monitor

by perlcgi (Hermit)
on May 06, 2000 at 21:50 UTC [id://10479]


in reply to RE: RE: Distributed network monitor
in thread Distributed network monitor

You might be able to achieve what you want with MRTG. Alternatively, I wrote some scripts a while back to do this. A cron job runs an LWP download from each site being measured on the hour, every hour. It simply measures the download time for each site and sticks the results in a flat file (not even a database). A CGI script then calculates site performance averages and displays a simple graph. So it's pretty lame and doesn't account properly for time-outs, but it might get you started. Let me know if you want it.
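The hourly run described above would be driven by a crontab entry along these lines (a sketch; the script path and log file name are illustrative, not from the original post):

```
# min hour dom mon dow  command - run the timing script on the hour, every hour
0    *    *   *   *     /usr/local/bin/perl /home/perlcgi/timesites.pl >> /var/logs/whatever 2>&1
```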

Replies are listed 'Best First'.
RE: RE: RE: RE: Distributed network monitor
by Novician (Novice) on May 08, 2000 at 13:54 UTC
    Yep...
    MRTG could be a start. At least I know the ISP Singnet in
    Singapore is using it, and I am also using it now at my attachment
    placement.
    I was asked to tie it together with HTML and a Perl script to make it
    easier to configure through the internal network, without having to use
    the vi editor, which Windows users might not know how to use.
    Running it from the crontab lets you run the mrtg programme
    whenever you specify it to run; alternatively, you can run mrtg in
    daemon mode with an interval set in MRTG, to the same effect as a
    crontab. Or a script that edits the crontab will do just fine.
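    The daemon mode mentioned above is switched on with global keywords in mrtg.cfg; a minimal sketch (the WorkDir path and interval are illustrative):

    ```
    # mrtg.cfg - run MRTG as a daemon instead of from cron
    WorkDir:     /var/www/mrtg
    RunAsDaemon: Yes
    Interval:    5        # poll targets every 5 minutes
    ```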
RE: RE: RE: RE: Distributed network monitor
by ChuckularOne (Prior) on May 08, 2000 at 21:29 UTC
    Please post it, if you don't mind. I'm looking for something like that to prove to a friend that it's his dial-up, not my web-site! :-) -Chuck
      OK, here's the bit that does the measuring. I'm not showing the CGI script that does the display - I'm too embarrassed and don't have the time to make it suitable for public consumption. I do have a little bit of hubris (and laziness in spades) :-) But if you *really* want it, leave your email. perlcgi.
#!/usr/local/bin/perl -w
# Written as a quick 'n' dirty hack - so here it is, warts and all.
# This just retrieves a hash of sites and times how long each page
# takes to download. Writes the timing for each site, tab-delimited,
# to stdout (or a file - see OPFILE below).
use strict;
use LWP::Simple;                  # I know, better to use LWP::UserAgent
use Time::HiRes qw(gettimeofday); # Accuracy overkill, maybe?

my $thisrun = localtime(time);    # When the data was collected

# Amend the next line as required or just leave it out
#open(OPFILE, ">>/var/logs/whatever") || die "Could not open /var/logs/whatever: $!";

# Hash of sites to be timed
my %url = (
    pubmed   => "http://www.ncbi.nlm.nih.gov",
    proquest => "http://proquest.umi.com",
    JSTOR    => "http://www.jstor.ac.uk/jstor/",
    AMAZONUK => "http://www.amazon.co.uk/",
    AMAZONUS => "http://www.amazon.com/",
    SPRR     => "http://sprr.library.nuigalway.ie",
    SCIDIR   => "http://www.sciencedirect.com/",
);

sub process_url {
    my $url          = shift;
    my $now_time     = gettimeofday;
    my $page         = get($url);
    my $elapsed_time = gettimeofday - $now_time;
    print "Site:$url\tTime Taken:$elapsed_time\tRun on $thisrun\n";
    # print OPFILE "$url\t$elapsed_time\t$thisrun\n";
}

foreach my $key (sort keys %url) {
    process_url($url{$key});
}
# That's it!
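perlcgi didn't post the CGI half, so here is only a guess at its averaging step: a minimal sketch that assumes the tab-delimited format written above (url, elapsed time, timestamp) in a log file whose path is hypothetical.

```perl
#!/usr/local/bin/perl -w
# Sketch only - not perlcgi's script. Averages per-site download times
# from a tab-delimited log: url \t elapsed \t timestamp.
use strict;

my (%total, %count);
open(LOG, "</var/logs/whatever") || die "Could not open log: $!";
while (<LOG>) {
    chomp;
    my ($url, $elapsed) = split /\t/;
    next unless defined $elapsed;
    $total{$url} += $elapsed;
    $count{$url}++;
}
close LOG;

foreach my $url (sort keys %total) {
    printf "%s\taverage %.3f s over %d samples\n",
        $url, $total{$url} / $count{$url}, $count{$url};
}
```

A real CGI would emit a Content-type header and HTML instead of plain text, but the aggregation is the same.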
