Some of you may remember my crazy problems with CGI as discussed in the chatterbox yesterday. I was having problems using any HTTP GET from within my CGI script, and that problem remains a mystery. Rather than continue to bang my head on the wall, I have decided to pursue a different architecture.
The gist of the problem is: I have a list of URLs, and I wish to display the HTML associated with each of those URLs, allowing the user to make some judgments on each page via radio buttons. Since LWP etc. don't seem to want to work for me, I have pre-fetched each of the pages, storing each as a unique, numbered file in a separate directory. Here is a toy example that exhibits the problem I am having:
#!/usr/local/bin/perl -w
use strict;
use CGI qw(:standard);
use CGI::Carp qw(warningsToBrowser fatalsToBrowser carpout);

$| = 1;
print header(-type => 'text/html'), start_html();

my $path = '/var/www/cgi-bin/samples';
print "hi";

undef $/;    # slurp mode: read each file in one gulp
opendir my $dir, $path or die "cant open dir $!";
my @parts = sort grep { !/^\.\.?$/ } readdir $dir;   # skip . and ..
closedir $dir;

foreach my $x (@parts) {
    open my $fh, '<', "$path/$x" or die "cant open $x $!";
    my $data = <$fh>;
    close $fh;
    print $data;
    system("date");   # timestamp between pages, for debugging
    sleep 5;
}
The error message I get is as follows:
Software error:
cant open dir Permission denied at /var/www/cgi-bin/test3.cgi line 15.
I have given a+rw permission to all the files involved, and a+rwx permission to the directory "samples" and to every parent directory in the file system tree. I'm clueless. Help me, monks, you're my only hope.
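For clarity, here is roughly what I did to open up the permissions, sketched against a throwaway temp directory rather than the live /var/www tree (the file name 0001.html is just a stand-in for one of my numbered sample files):

```shell
# Demo of the permission changes described above, run against a temp
# directory so it is safe to execute anywhere; $base stands in for the
# real /var/www/cgi-bin path.
base=$(mktemp -d)
mkdir "$base/samples"
touch "$base/samples/0001.html"

chmod a+rw  "$base/samples"/*   # every pre-fetched file: read/write for all
chmod a+rwx "$base/samples"     # the samples dir: read/write/search for all
chmod a+rx  "$base"             # parent dirs also need search (x) permission

ls -ld "$base/samples"
```

After this, `ls -ld` on the samples directory shows `drwxrwxrwx`, so as far as classic Unix permissions go, any user (including whatever user the web server runs the CGI as) should be able to opendir it.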