PerlMonks  

Re: Listing Active CGI::Sessions

by davido (Cardinal)
on Sep 05, 2004 at 15:35 UTC ( [id://388621] )


in reply to Listing Active CGI::Sessions

Just keep track of session IDs. You can do this by storing each session ID and its accompanying session info in your DB. Any time a session is running, re-check whether any of the timestamps in the DB have expired. Then it's just a matter of counting how many haven't expired yet.

For session key and expiration information, have a look at the POD for CGI::Session.
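A minimal sketch of the approach above, using an in-memory SQLite database for illustration (swap the DSN for your MySQL connection). The table name `session_track` and its columns are assumptions, not part of CGI::Session itself; the point is just to record each session ID with its expiry time yourself so one query gives the count.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical tracking table: swap "dbi:SQLite:dbname=:memory:" for
# your real MySQL DSN. SQLite just keeps this example self-contained.
my $dbh = DBI->connect( "dbi:SQLite:dbname=:memory:", "", "",
    { RaiseError => 1 } );

$dbh->do("CREATE TABLE session_track (sid TEXT PRIMARY KEY, expires INTEGER)");

# Record a session whenever one is created or touched.
my $ins = $dbh->prepare(
    "INSERT OR REPLACE INTO session_track (sid, expires) VALUES (?, ?)");
$ins->execute( 'abc123', time() + 3600 );    # expires in an hour
$ins->execute( 'def456', time() - 60 );      # already expired

# "Counting up how many haven't expired yet" is then a single query:
my ($active) = $dbh->selectrow_array(
    "SELECT COUNT(*) FROM session_track WHERE expires > ?", undef, time() );
print "Current Members Online: $active\n";   # prints 1 here
```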


Dave

Replies are listed 'Best First'.
Re^2: Listing Active CGI::Sessions
by Anonymous Monk on Sep 05, 2004 at 16:30 UTC
    I have MySQL tracking all the session info. What I'm trying to accomplish is having administrators and users see how many current live (unexpired) sessions there are (i.e. Current Members Online: 334). I've checked the Perl docs for CGI::Session and they don't seem to explain how to go about doing this.

    The administrator wants this feature in order to cancel/delete sessions of abusive members.

    Thanks for the reply.
      If you're looking for something like a count of total members signed in, or a list of all members signed in, then you could do what we did.

      That is, just have each page request store which user the page was requested for, along with the current time, in a database table. Make the UserID field unique and the table shouldn't grow out of hand. When you want stats, you can just SELECT COUNT(*) FROM ActiveSessions or SELECT UserID FROM ActiveSessions. I would suggest clearing old records from the table with a 5-minute cron or a pseudo-random scheme (maybe 1 in 500 page loads).
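The scheme above might look like this in Perl, with SQLite standing in for MySQL so the sketch runs anywhere (on MySQL you would use REPLACE INTO or ON DUPLICATE KEY UPDATE instead of SQLite's INSERT OR REPLACE). The table and column names are the ones suggested above; the helper names are made up for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( "dbi:SQLite:dbname=:memory:", "", "",
    { RaiseError => 1 } );

# UserID is the primary key, so one row per user -- the table can't
# grow past the number of distinct logged-in users.
$dbh->do("CREATE TABLE ActiveSessions (UserID TEXT PRIMARY KEY, LastSeen INTEGER)");

# Call this on every page request for the logged-in user.
sub touch_user {
    my ( $dbh, $user ) = @_;
    $dbh->do(
        "INSERT OR REPLACE INTO ActiveSessions (UserID, LastSeen) VALUES (?, ?)",
        undef, $user, time() );
}

# Run this from a 5-minute cron, or on roughly 1 in N page loads.
sub reap_idle {
    my ( $dbh, $max_idle ) = @_;
    $dbh->do( "DELETE FROM ActiveSessions WHERE LastSeen < ?",
        undef, time() - $max_idle );
}

touch_user( $dbh, 'alice' );
touch_user( $dbh, 'bob' );
touch_user( $dbh, 'alice' );    # UserID is unique, so no duplicate row

reap_idle( $dbh, 300 );
my ($online) = $dbh->selectrow_array("SELECT COUNT(*) FROM ActiveSessions");
print "Current Members Online: $online\n";   # prints 2
```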

      Another thing you can do is keep summary data that is regenerated either a) every, say, 5 minutes, or b) whenever the underlying data changes (i.e. someone logs in or out). This plan works better for things that don't change very often, or where you have a high number of records to deal with.
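Option (a) above can be sketched as a small cache around the expensive query, so the COUNT(*) runs at most once per five-minute window rather than on every page view. The recompute coderef here is a hypothetical stand-in for the real database query.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Cache the summary for 300 seconds; only recompute when it's stale.
{
    my ( $cached, $cached_at ) = ( undef, 0 );

    sub active_count {
        my ($recompute) = @_;    # coderef that runs the real COUNT(*)
        if ( !defined $cached || time() - $cached_at >= 300 ) {
            $cached    = $recompute->();
            $cached_at = time();
        }
        return $cached;
    }
}

my $db_hits = 0;
my $count = sub { $db_hits++; return 334 };   # stand-in for the real query

print active_count($count), "\n" for 1 .. 3;  # same cached value each time
print "DB queried $db_hits time(s)\n";        # queried only once
```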

      Note that this method can get database-intensive once you get a lot of site traffic. For the above linked site, the sessions database does not run on the same machines that run the site's actual content database.

      PS: Yes, I wrote machines, as in plural. 60 gigs of web files (i.e. not including protocol overhead) is not an odd weekday. I think those databases get a fair bit of work for a website. - frink
