Sounds to me like you know your stuff.
And that your stuff is indeed not web stuff :-)
Separate the part where you take the input, generate the output, and present it to the end user.
What is the longest that this can take to process?
Limit the input, that is, check it. Test out what happens when you feed it bogus info.
If your program is CPU intensive (likely, with graphics, possibly multiple users), what happens if each run takes 60 seconds and 4 users keep hitting refresh on their browsers? Will it crash your server?
If the operation is intensive, and could take up to a minute, consider using sessions, and storing something like 'has the user queued a call to generate output yet?' and 'how old is that call to generate output?'
Maybe if the user refreshes, the session should be checked to see if a call no older than x is already present from this user.
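A minimal sketch of that session check, assuming CGI::Session; the `queued_at` parameter name and the 60-second limit are made up for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use CGI::Session;

my $MAX_AGE = 60;    # assumed worst-case runtime, in seconds

my $cgi     = CGI->new;
my $session = CGI::Session->new($cgi) or die CGI::Session->errstr;

my $queued_at = $session->param('queued_at');
if ( defined $queued_at && time() - $queued_at < $MAX_AGE ) {
    # A recent call is already queued - don't start another one
    print $cgi->header, "Your graph is still being generated, please wait...\n";
}
else {
    # Record that this user has queued a call, then hand the work off
    $session->param( 'queued_at', time() );
    print $cgi->header, "Job queued.\n";
}
```

The point is that hitting refresh just re-reads the session instead of kicking off another expensive run.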
I would personally separate all this into something that works without any web interface first. Once you test that out and see that it doesn't grind the server to a halt, you can worry about the interface.
I would use CGI::Application.
But this *may* be overkill.
If all you are doing is
- initiate session (CGI::Session)
- show input form
- accept input, match against the session to know they are not hitting refresh a million times
- pass the parameters to another script (or use fork()), and record in the session that this user has queued a call for output
- and perhaps on every page reload, check whether the user's call for output is ready.
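The fork() hand-off in that list could look something like this; the worker script path and the argument format are assumptions:

```perl
use strict;
use warnings;
use POSIX qw(setsid);

# Hypothetical sketch: the CGI process returns immediately
# while a detached child generates the graph.
sub queue_graph_job {
    my (%params) = @_;    # validated input from the form

    defined( my $pid = fork() ) or die "fork failed: $!";
    return if $pid;       # parent: go back to answering the request

    # child: detach so the web server doesn't wait on us
    setsid();
    close STDIN; close STDOUT; close STDERR;
    exec( '/usr/local/bin/generate_graph.pl',    # assumed worker script
          map { "$_=$params{$_}" } keys %params )
        or die "exec failed: $!";
}
```

Either way, the request handler itself never blocks on the expensive work.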
Then again maybe it's not overkill. If you do use CGI::Application, make *sure* you look over the plugins, for example, CGI::Session has a CGI::Application::Plugin::Session counterpart which is candy to use.
Maybe you could even email the results to them instead. Then your system would have a queue whose entries are identified by email address.
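A toy, in-memory sketch of such an email-keyed queue (an assumption on my part; real code would persist it to a file or database between requests, and actually send the mail):

```perl
use strict;
use warnings;

# One pending job per email address
my %queue;

# Returns 1 if the job was queued, 0 if that address already has one pending
sub enqueue {
    my ($email, $params) = @_;
    return 0 if exists $queue{$email};
    $queue{$email} = { params => $params, queued_at => time() };
    return 1;
}

# Called when the worker finishes: clear the slot and deliver the result
sub finish {
    my ($email, $result) = @_;
    delete $queue{$email};
    # ... here you would mail $result to $email ...
}
```

A second submit from the same address while a job is pending simply gets refused.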
I think you should really see this as two problems: a) what your program does (generate a graph), and b) keeping load and requests from blowing everything up.