Timing Web Page Requests

by insensate (Hermit)
on Mar 11, 2002 at 17:48 UTC

insensate has asked for the wisdom of the Perl Monks concerning the following question:

Greetings, I have been asked to write a script that will calculate the time a web page takes to load... I interpreted this requirement as the time between an HTTP request and response for the given page. The script I've come up with is posted below. Before I start using it I wanted to subject it to scrutiny here, hoping to avoid embarrassment due to inaccuracy. Any direction or criticism would be greatly appreciated... Thanks in advance, Jason
#!/usr/bin/perl -w
use strict;
use LWP::UserAgent;
use Crypt::SSLeay;                 # SSL support, so https URLs work too
use Time::HiRes qw(gettimeofday);

print "Enter URL to time GET request for... ";
my $host = <STDIN>;
chomp $host;

my $ua      = LWP::UserAgent->new();
my $request = HTTP::Request->new('GET', $host);

# Start the clock only once the request is built, so we time the
# request/response exchange rather than object construction.
my $before   = gettimeofday;
my $response = $ua->request($request);
my $elapsed  = gettimeofday - $before;

print "ERROR: Bad URL\n" if $response->is_error;
print "Request took $elapsed seconds.\n";

my @content = split /\n/, $response->content;
my @title   = grep /title/i, @content;   # crude search for the <title> line
print "TITLE LINE: @title\n" unless $response->is_error;

Replies are listed 'Best First'.
Re: Timing Web Page Requests
by dws (Chancellor) on Mar 11, 2002 at 18:25 UTC
    If the page you're loading is trivial (i.e., has no external style sheets or JavaScript files, no frames, and no images), your script will more or less work, though you're better off using time() instead of gettimeofday().

    Most pages of interest aren't trivial, which means that after loading the base page, you need to parse it to find style sheets, frames, images, etc., so that you can load the additional components. This isn't a trivial exercise, particularly if the images are getting loaded by way of JavaScript (say, for rollovers) or via a style sheet. Simulating a browser can take a lot of work; you can sink a lot of time into parsing JavaScript and CSS.
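    As a rough sketch of that first step (assuming the components appear as plain src/href markup; anything pulled in by JavaScript or CSS won't be found this way), one could time the base page plus its components with HTML::LinkExtor:

    #!/usr/bin/perl -w
    use strict;
    use LWP::UserAgent;
    use HTML::LinkExtor;
    use Time::HiRes qw(gettimeofday);

    my $url = shift || die "usage: $0 URL\n";
    my $ua  = LWP::UserAgent->new;

    my $before = gettimeofday;
    my $base   = $ua->request(HTTP::Request->new(GET => $url));
    die $base->status_line, "\n" unless $base->is_success;

    # Collect the URLs of embedded components from the base page
    my @links;
    my $p = HTML::LinkExtor->new(
        sub {
            my ($tag, %attr) = @_;
            push @links, values %attr
                if $tag =~ /^(?:img|script|link|frame|iframe)$/;
        },
        $base->base,              # resolve relative URLs against the page
    );
    $p->parse($base->content);

    # Fetch each component (serially; a real browser would parallelize)
    $ua->request(HTTP::Request->new(GET => $_)) for @links;

    my $elapsed = gettimeofday - $before;
    printf "Page plus %d components took %.3f seconds\n",
        scalar @links, $elapsed;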

    To further complicate things, there's caching and session "keep-alive" behavior to emulate.

    And if you want to get a really accurate read on how long it takes to read a page, you have to pull out the big guns, and analyze packet traces. Probably more than you need, but it's insightful to do at least once.

    So, how accurate a timing do you need?


    Update: And if you're counting bytes, don't forget to count both the HTTP headers and the TCP/IP headers. It's surprisingly expensive, in terms of packets and data exchanged, to verify that a .GIF you have cached locally hasn't changed.
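    That cache check is a conditional GET; a minimal sketch (the image URL is hypothetical) shows that even a 304 answer costs a full round trip:

    #!/usr/bin/perl -w
    use strict;
    use LWP::UserAgent;
    use HTTP::Date qw(time2str);

    my $ua  = LWP::UserAgent->new;
    my $url = 'http://www.example.com/logo.gif';   # hypothetical cached image

    # Ask for the body only if it changed in the last day
    my $req = HTTP::Request->new(
        GET => $url,
        [ 'If-Modified-Since' => time2str(time - 86400) ],
    );
    my $res = $ua->request($req);

    print $res->code == 304
        ? "304 Not Modified: headers only, but still a round trip\n"
        : "Fresh copy: " . length($res->content) . " bytes\n";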

Re: Timing Web Page Requests
by Hero Zzyzzx (Curate) on Mar 11, 2002 at 17:59 UTC

    This isn't a Perl solution, but I still think it'd be useful. If you're using the ubiquitous Apache server, you should probably check out the lovely Apache benchmark tool, 'ab'. It will give you FAR better results than your script will, because it lets you set the number of requests, the concurrency, and cookie values. It also gives nicely formatted, readable results.

    Timing one request isn't going to be that useful anyway; you're going to get varying results depending on a bunch of factors. Use ab to do multiple requests at different times and you'll get much more useful results.
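    For example (hostname and counts made up), something like this fires 100 requests, 10 at a time, with a cookie set, and summarizes the connection times:

    ab -n 100 -c 10 -C session=abc123 http://www.example.com/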

    -Any sufficiently advanced technology is
    indistinguishable from doubletalk.

Re: Timing Web Page Requests
by guha (Priest) on Mar 11, 2002 at 19:43 UTC

    You could also check out this link to the mighty merlyn's WebReview columns, where he addresses the download timing problem.

    It does not take rendering time or script parsing into account, as dws correctly notes. However, you may find it as interesting as I did.

    ---
    I would like to change the world but God won't let me have the source code.
Re: Timing Web Page Requests
by traveler (Parson) on Mar 11, 2002 at 19:02 UTC
    I'd like to echo the comments of dws. Timing complex loads is a complex problem. Another issue is that when many people ask about the time required to "load a page", they mean the time from the request until all the graphics have finished rendering.

    Some browsers render pages faster than others. In my experience Opera is pretty fast on most pages and Netscape is very slow on some pages. IE can be slow on some pages, too. Measuring the time to render is somewhat more difficult than just timing data transfer.

    You should also consider the time required to load and display banners from third-party advertisers or ad services, if you use them.

    You may also consider running your test a few times in a row, or over a longer time span, so that transient Internet delays don't dominate your numbers if the page is not on your local server; a minimal loop for that is sketched below.
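    Something like this (a sketch; the sample size is arbitrary) averages several timed GETs:

    #!/usr/bin/perl -w
    use strict;
    use LWP::UserAgent;
    use Time::HiRes qw(gettimeofday);

    my $url   = shift || die "usage: $0 URL\n";
    my $runs  = 10;                      # arbitrary sample size
    my $ua    = LWP::UserAgent->new;
    my $total = 0;

    for (1 .. $runs) {
        my $before = gettimeofday;
        $ua->request(HTTP::Request->new(GET => $url));
        $total += gettimeofday - $before;
        sleep 1;                         # space the samples out a little
    }
    printf "Average over %d requests: %.3f seconds\n", $runs, $total / $runs;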

    If the desired end result is to get the page sped up a bit, consider netmechanic or web site garage, or another service that can help with that.

    HTH, --traveler

Re: Timing Web Page Requests
by webadept (Pilgrim) on Mar 11, 2002 at 20:27 UTC
    I'd like to echo everything the others have said, and I noticed you have the SSL module in there; that's going to throw you off as well. It's been my experience that different browsers have different speeds in rendering secure pages, and all of them are slower than a direct Perl grab like you have here.

    Not a lot of good news in here eh? :-P

    Personally I use the Apache program mentioned above. It's configurable, and it does a good "overall" job. If you need the "browser effect", I'm still using a stopwatch there. I'm really hoping some guru will stop by with a script method for you.

    Glenn H.
Re: Timing Web Page Requests
by belg4mit (Prior) on Mar 11, 2002 at 20:38 UTC
    I interpreted this requirement as the time between an HTTP request and response for the given page.

    In that context this seems fine; many seem to have missed that this was your definition of fetching a page. However, if you only seek to measure this response time, a HEAD request would be kinder, as sketched below.
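    A minimal sketch of the HEAD variant (same status line and headers as a GET, but no body, so both the server and the wire do less work):

    #!/usr/bin/perl -w
    use strict;
    use LWP::UserAgent;
    use Time::HiRes qw(gettimeofday);

    my $url = shift || die "usage: $0 URL\n";
    my $ua  = LWP::UserAgent->new;

    my $before   = gettimeofday;
    my $response = $ua->request(HTTP::Request->new(HEAD => $url));
    my $elapsed  = gettimeofday - $before;

    printf "%s answered HEAD in %.3f seconds (%s)\n",
        $url, $elapsed, $response->status_line;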

    --
    perl -pe "s/\b;([st])/'\1/mg"

Re: Timing Web Page Requests
by pizza_milkshake (Monk) on Mar 12, 2002 at 01:57 UTC
    #!/usr/bin/perl -w
    use strict;

    # why do all the hard work yourself?
    chdir "/tmp";
    my $t = time;
    my $a = `wget -r --level=1 --span-hosts --follow-tags=img,object,applet,link,script http://www.yahoo.com/`;
    print "Took " . (time - $t) . " seconds\n";
    perl -e'@a=split//," \n/)\"={\"",$b="00002731000223155264555102401";print $a[$_] for split//,$b'
      the more the merrier... this is an improvement
      #!/usr/bin/perl -w
      use strict;

      my ($t, $str) = (time, "wget -r --level=1 --quiet --span-hosts --follow-tags=img,object,applet,link,script http://www.yahoo.com/");
      chdir "/tmp";
      `$str` for 1 .. 50;
      print "Took " . ((time - $t) / 50) . " seconds\n";
      perl -e'@a=split//," \n/)\"={\"",$b="00002731000223155264555102401";print $a[$_] for split//,$b'
Re: Timing Web Page Requests
by bastard (Hermit) on Mar 12, 2002 at 15:25 UTC
    In the past I've found the following useful:

    Benchmark or Benchmark::Timer - made for benchmarking sections of code (I've only used the latter)

    HTTP::WebTest - a web testing module set. It also looks like it reports request time (no idea how accurate it is).

    To get a better picture of the load time, load the page multiple times and take the average of the sequence; a Benchmark::Timer sketch of that follows below. (This technique can be extended into load testing as well: just add threaded HTTP requests and run them all at once.)
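    A minimal sketch with Benchmark::Timer (the target URL is hypothetical; skip => 1 throws away the first, cold-cache run):

    #!/usr/bin/perl -w
    use strict;
    use LWP::UserAgent;
    use Benchmark::Timer;

    my $url = 'http://www.example.com/';        # hypothetical target
    my $ua  = LWP::UserAgent->new;
    my $t   = Benchmark::Timer->new(skip => 1); # discard the first run

    for (1 .. 11) {                             # 1 discarded + 10 kept runs
        $t->start('fetch');
        $ua->request(HTTP::Request->new(GET => $url));
        $t->stop('fetch');
    }
    print $t->report;                           # mean and std. dev. per tag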
