|Perl: the Markov chain saw|
I'm writing a database-backed web application for my office to allow users to list, view details of, and edit a bunch of client data.
Normally, I would write some CGI / perl-mason scripts like this:
Those programs would open the database, run a query, and then return the results formatted as HTML. Simple stuff.
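For concreteness, here is a minimal sketch of what I mean, assuming DBI with DBD::SQLite and a hypothetical `clients` table (`name`, `phone` columns are made up for illustration):

```perl
#!/usr/bin/perl
# Hypothetical list.cgi: query the clients table and print an HTML table.
use strict;
use warnings;
use CGI qw(header escapeHTML);
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=clients.db', '', '',
                       { RaiseError => 1 });

print header('text/html');
print "<html><body><table>\n";

my $sth = $dbh->prepare('SELECT name, phone FROM clients ORDER BY name');
$sth->execute;
while (my ($name, $phone) = $sth->fetchrow_array) {
    # Escape values so client data can't inject markup
    printf "<tr><td>%s</td><td>%s</td></tr>\n",
        escapeHTML($name), escapeHTML($phone);
}

print "</table></body></html>\n";
$dbh->disconnect;
```

Every page view costs a database connection and a query, which is what I'd like to avoid.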
Since I expect that users will be viewing data (read-only) much more often than editing it, I am considering instead writing a build_html() script that walks through every client in the database and writes a few different static .html files for each. I will call this script after anyone uses any of the edit.cgi scripts.
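The build script might look something like this (again a sketch: the output directory, table, and column names are assumptions, and I write to a temp file then rename so readers never see a half-written page):

```perl
#!/usr/bin/perl
# Hypothetical build_html.pl: regenerate one static page per client.
use strict;
use warnings;
use DBI;

my $out_dir = '/var/www/clients';
my $dbh = DBI->connect('dbi:SQLite:dbname=clients.db', '', '',
                       { RaiseError => 1 });

my $sth = $dbh->prepare('SELECT id, name, phone FROM clients');
$sth->execute;
while (my $row = $sth->fetchrow_hashref) {
    my $file = "$out_dir/$row->{id}.html";
    open my $fh, '>', "$file.tmp" or die "open $file.tmp: $!";
    print {$fh} "<html><body><h1>$row->{name}</h1>",
                "<p>Phone: $row->{phone}</p></body></html>\n";
    close $fh;
    # Atomic swap: readers see either the old page or the new one, never a partial file
    rename "$file.tmp", $file or die "rename $file: $!";
}
$dbh->disconnect;
```

The edit.cgi scripts would invoke this (or a single-client variant of it) after committing a change.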
The advantage is that I will avoid lots and lots of redundant database calls. The disadvantage is that I will have a directory with about 5-10 thousand HTML files. According to df -i, the partition I'll be working on has about 3 million inodes available, so I don't think I'll be doing much damage there, but I wanted to get the wisdom of the monks before I begin.
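For anyone wanting to run the same check, this is how I read the free-inode count (the mount point here is just an example; substitute the partition your output directory lives on):

```shell
# Column 4 of `df -i` is IFree, the number of unused inodes on the filesystem
df -i / | awk 'NR==2 {print $4 " inodes free"}'
```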
Is there a third way? All comments are welcome.