http://qs321.pair.com?node_id=229577


in reply to After the OS upgrade the site responsiveness

I don't disapprove of market research, i.e. surveying what users think of the site's performance. However, I think it is less useful than an objective benchmark. A big drawback is that people's expectations are variable and inconsistent. Do you expect people to remember accurately how the performance was before the upgrade?

It is also important to distinguish between the different scenarios: "Perlmonks is generally slow", "Perlmonks is appallingly slow today", and "Function X (e.g. logging in) is slow".

Presumably, with minimal impact on performance, the Everything code could write a log entry of its own when each page completes, recording the request type and the start and finish times of each HTTP request. At the end of the day, a script could munge the log files and produce statistics about response times.
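
Something along these lines could do the munging. This is only a sketch: it assumes a hypothetical log format of one whitespace-separated record per request (request type, start time, finish time in epoch seconds), which the Everything code would have to be taught to emit.

    #!/usr/bin/perl
    # Sketch: summarise response times from a hypothetical per-request log.
    # Each line is assumed to look like: "request_type start_epoch finish_epoch"
    use strict;
    use warnings;

    my %times;    # request type => list of elapsed seconds

    while (my $line = <>) {
        chomp $line;
        my ($type, $start, $finish) = split ' ', $line;
        next unless defined $finish;
        push @{ $times{$type} }, $finish - $start;
    }

    for my $type (sort keys %times) {
        my @t   = sort { $a <=> $b } @{ $times{$type} };
        my $n   = @t;
        my $sum = 0;
        $sum += $_ for @t;
        printf "%-20s n=%6d  mean=%.3fs  95th=%.3fs  max=%.3fs\n",
            $type, $n, $sum / $n, $t[ int(0.95 * ($n - 1)) ], $t[-1];
    }

Reporting a 95th percentile as well as the mean matters here, because "the site feels slow" is usually about the worst requests, not the average one.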

I presume that the bottleneck is MySQL database access. In that case, turning on Devel::DProf for some sample periods could provide useful insights without making the site unusable.
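
For a standalone script the usual invocation is roughly as follows; under mod_perl you would need one of the profiling wrappers instead (Apache::DProf from the Apache::DB distribution, if memory serves):

    # profile a run; Devel::DProf writes its data to tmon.out
    perl -d:DProf some_script.pl

    # summarise wall-clock and CPU time spent per subroutine
    dprofpp tmon.out

If the time really is going into the database, the DBI methods (execute, fetchrow_*) should float to the top of the dprofpp report; if they don't, the problem is elsewhere in the code.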

Also, Pair can presumably provide statistics on busyness, in terms of hit rates and the number of accesses per interval. This would give an objective measure of whether the site is generally being used more heavily.
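
Failing that, if the raw web server access logs are available, the same numbers are easy to derive. A quick sketch, assuming the usual Apache common log format with its bracketed "[dd/Mon/yyyy:hh:mm:ss +zzzz]" timestamp:

    #!/usr/bin/perl
    # Sketch: count hits per hour from an Apache common-log-format access log.
    use strict;
    use warnings;

    my %hits;
    while (<>) {
        # capture "dd/Mon/yyyy:hh" from the bracketed timestamp
        $hits{$1}++ if m{\[(\d{2}/\w{3}/\d{4}:\d{2})};
    }

    # note: lexical sort, which is fine within a single day's log
    printf "%s  %d\n", $_, $hits{$_} for sort keys %hits;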

My $0.02 --rW
