Fast processing of XML files for CGI
by AcidHawk (Vicar) on Dec 08, 2003 at 10:21 UTC
AcidHawk has asked for the wisdom of the Perl Monks concerning the following question:
I am running ASPerl 5.6.1 on Windows 2000 using Apache as the web server. I need a quick solution to display a page while our production Helpdesk has its legs in the air. This solution may need to be used again in the future, so I would like to start with a quick win and expand it into something more complete a little later.
The Problem: I have an automated process that creates small XML files in several dirs. These files can number upwards of 200 per dir, across about 7 dirs.
Each XML file looks similar to the following:
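For illustration only (the element names here are placeholders, not the real Helpdesk schema), think of each file as a handful of small elements:

    <?xml version="1.0"?>
    <Call>
      <CallID>12345</CallID>
      <Summary>Printer on floor 3 not responding</Summary>
      <Status>Open</Status>
      <LoggedAt>2003-12-08 09:55</LoggedAt>
    </Call>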
Basically I need to display a table with the dir name and some of the contents of all the files in its dir. Something like:
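Roughly this shape (the directory and field names are the same placeholders as above):

    Directory    CallID   Summary                              Status
    ---------    ------   ----------------------------------   ------
    queue1       12345    Printer on floor 3 not responding    Open
    queue1       12346    Password reset for jbloggs           Closed
    ...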
This is a snippet of what I have at the moment, which is proving FAR too slow:
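Broadly, it boils down to re-parsing every file on every request, along the lines of this sketch (XML::Simple, the paths, and the field names are assumptions, not the exact code):

    use strict;
    use warnings;
    use CGI;
    use XML::Simple;    # assumption: a straightforward full parse per file

    my $q = CGI->new;
    print $q->header, $q->start_html('Helpdesk queues');

    # Roughly 7 dirs x 200 files, fully re-parsed on every page load --
    # this is where the time goes.
    for my $dir (glob 'C:/helpdesk/queue*') {         # hypothetical path
        print $q->h2($dir), qq{<table border="1">\n};
        for my $file (glob "$dir/*.xml") {
            my $ref = XMLin($file);                    # full XML parse, every file, every hit
            print $q->Tr($q->td([ $ref->{CallID}, $ref->{Summary}, $ref->{Status} ])), "\n";
        }
        print "</table>\n";
    }
    print $q->end_html;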
I thought of putting all the data from the files into a hash so that I only have to process the relevant bits when I build the web page.
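A minimal sketch of that idea, assuming Storable for the cache file and the same placeholder field names, keyed on each file's mtime so only new or changed files get re-parsed:

    use strict;
    use warnings;
    use Storable qw(retrieve nstore);
    use XML::Simple;

    my $cache_file = 'C:/helpdesk/xmlcache.sto';      # hypothetical cache location
    my $cache = -e $cache_file ? retrieve($cache_file) : {};

    # Parse a file only if it is new or its mtime has changed;
    # otherwise reuse the values already in the hash.
    sub row_for {
        my ($file) = @_;
        my $mtime = (stat $file)[9];
        my $entry = $cache->{$file};
        unless ($entry && $entry->{mtime} == $mtime) {
            my $ref = XMLin($file);
            $entry = {
                mtime => $mtime,
                row   => [ @{$ref}{qw(CallID Summary Status)} ],   # keep only what the table needs
            };
            $cache->{$file} = $entry;
        }
        return $entry->{row};
    }

    # ... build the table from row_for($file) for each file in each dir ...

    nstore($cache, $cache_file);    # persist the hash for the next request

Because the files are small and only change when the automated process writes them, the expensive XML parsing would then happen once per file rather than once per page view.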
What can I do to read these files and put some of their detail into a web page before the page times out or tries to refresh itself (120 secs)?
It must be said that I am using CGI, but CGI/HTML is NOT where what little experience I have lies.

-----
Of all the things I've lost in my life, it's my mind I miss the most.