Re: Image Hosting Application

by BUU (Prior)
on Jul 20, 2004 at 22:41 UTC ( [id://376110] )


in reply to Image Hosting Application

Since you seem to have two main concerns, efficiency and testability, let me try to address them in turn.

First off, efficiency. As the mantra goes, premature optimization is the root of all evil. In other words, even if you *think* something might be slow, benchmark it and find out! If an SQL query plus a mod_rewrite handler call really is excessive overhead for sending an image (I doubt it, but maybe...), then you have several options.
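
For instance, here is a minimal sketch of timing just the database check, assuming DBI, a MySQL DSN, and a hypothetical 'bandwidth' table; all the names are placeholders, not your actual schema:

    use strict;
    use warnings;
    use Benchmark qw(timethese);
    use DBI;

    # Placeholder DSN, credentials, table and column names.
    my $dbh = DBI->connect('dbi:mysql:images', 'user', 'pass')
        or die $DBI::errstr;

    timethese(1_000, {
        # The real work: one bandwidth lookup per image request.
        sql_check => sub {
            my ($used) = $dbh->selectrow_array(
                'SELECT bytes_used FROM bandwidth WHERE user_id = ?',
                undef, 42,
            );
        },
        # Baseline: serve with no check at all.
        no_check  => sub { 1 },
    });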

My first option would be to maintain a hash in memory (using one of the IPC::Share-type modules) that all of your scripts can access to get the current bandwidth usage. That removes the SQL transaction time and should speed things up.
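
A sketch of what the shared counter might look like inside each image-serving script, assuming IPC::Shareable (one of the IPC::Share-type modules) and a hypothetical per-user byte count:

    use strict;
    use warnings;
    use IPC::Shareable;

    # 'bwdt' is an arbitrary four-character glue key shared by all scripts.
    tie my %bandwidth, 'IPC::Shareable', 'bwdt', { create => 1, mode => 0666 };

    # After sending an image, record how many bytes went out for this user.
    my ($user_id, $bytes_sent) = (42, 153_284);    # placeholders
    (tied %bandwidth)->shlock;
    $bandwidth{$user_id} += $bytes_sent;
    (tied %bandwidth)->shunlock;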

The second component of this system would be a daemon that regularly polls the in-memory hash and writes its data to a database. Just make sure you use locking.
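
The daemon side might look something like this, under the same assumptions as above (the 'bwdt' glue key, a hypothetical 'bandwidth' table, hourly polling):

    use strict;
    use warnings;
    use IPC::Shareable;
    use DBI;

    tie my %bandwidth, 'IPC::Shareable', 'bwdt', { create => 0 };
    my $dbh = DBI->connect('dbi:mysql:images', 'user', 'pass')
        or die $DBI::errstr;

    while (1) {
        sleep 3600;    # poll once an hour

        # Lock, copy and reset the shared hash so the web scripts block
        # for as short a time as possible.
        (tied %bandwidth)->shlock;
        my %snapshot = %bandwidth;
        %bandwidth   = ();
        (tied %bandwidth)->shunlock;

        # Persist the accumulated counts.
        for my $user_id (keys %snapshot) {
            $dbh->do(
                'UPDATE bandwidth SET bytes_used = bytes_used + ? WHERE user_id = ?',
                undef, $snapshot{$user_id}, $user_id,
            );
        }
    }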

The only problem I see here is that if, say, the daemon checks for new data every hour and your machine crashes 59 minutes after the daemon last ran, you could lose up to 59 minutes' worth of transfer data. I don't really see this as a problem personally, since for the type of application you describe an extra dozen or hundred "free" requests shouldn't hurt you or anyone else, and they may make a decent "bonus" given that the machine did crash.

(On a slightly different tack, however: instead of bandwidth limits and so forth, have you considered only allowing the images to be displayed on pages served from your own site, and then just not worrying about bandwidth at all?)
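
One common way to do that is a Referer check in the image-serving script; here is a rough sketch assuming a CGI-style environment and a placeholder domain. Keep in mind the Referer header can be empty or spoofed, so treat this as a deterrent, not a guarantee:

    # Refuse to serve the image unless the request came from a page on our site.
    my $referer = $ENV{HTTP_REFERER} || '';
    unless ($referer =~ m{^https?://(?:www\.)?example\.com/}i) {
        print "Status: 403 Forbidden\r\nContent-Type: text/plain\r\n\r\n";
        print "Hotlinking not permitted.\n";
        exit;
    }
    # ...otherwise fall through and serve the image as usual.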

On your second concern, testability, I have two thoughts. The first is limited to the situation at hand, in which testing should be simple. Implement whatever you want, add some pictures and some limits, then just have ab (ApacheBench) or something similar generate a few thousand requests for a GIF. There's your testing. Not hard.
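
For example, something like the following ab run (hypothetical URL; -n is the total number of requests, -c the concurrency) would hammer a single image five thousand times:

    ab -n 5000 -c 20 http://your.test.server/images/sample.gif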

As for testing in general, the idea that came to mind was to take a snapshot of your app's data at a particular time, then start recording HTTP requests (perhaps by reading them from the log file, whatever). Once you had, say, a day's worth of requests, you could use the snapshot data you stored previously to recreate the conditions on your test machine, then play back the HTTP requests. You could even play them back at a faster rate, say a day in 3 hours or something, to generate higher load.
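
A rough sketch of such a replay script, assuming Common Log Format access logs, LWP::UserAgent, and a placeholder test host; the speedup factor just compresses the gaps between requests:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;
    use Time::HiRes qw(sleep);    # allow fractional-second sleeps

    my $test_host = 'http://test.example.com';    # placeholder test machine
    my $speedup   = 8;                            # replay ~8x faster than real time

    my $ua = LWP::UserAgent->new;
    open my $log, '<', 'access.log' or die "Can't open log: $!";

    my $prev;
    while (my $line = <$log>) {
        # Pull the time-of-day and the requested path out of a CLF line.
        next unless $line =~ m{\[\d+/\w+/\d+:(\d+):(\d+):(\d+)[^\]]*\]\s+"GET\s+(\S+)};
        my ($h, $m, $s, $path) = ($1, $2, $3, $4);
        my $t = $h * 3600 + $m * 60 + $s;

        # Sleep a scaled-down fraction of the original gap between requests.
        sleep(($t - $prev) / $speedup) if defined $prev && $t > $prev;
        $prev = $t;

        $ua->get("$test_host$path");
    }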
