A little more context would be useful, since a cronjob sounds like the more appropriate solution here. Is there a reason you don't want to use one? Code like this puts unnecessary load on your web server, since the check will run redundantly on every request, while a cronjob would run the cleanup exactly as often as needed. Just be sure you're using the right tool.
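For comparison, the cron route can be a single crontab entry. This is a sketch: the directory comes from the code below, the 4:00 AM schedule is arbitrary, and `-delete` assumes GNU find (on other systems, use `-exec rm {} \;` instead):

```
# Every day at 4:00 AM, delete plain files under /tmp/wwwtrash
# that were last modified more than one day ago (GNU find assumed)
0 4 * * * find /tmp/wwwtrash -type f -mtime +1 -delete
```

With that in place, the web-facing script doesn't need to do any cleanup at all.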
In any case, I'll assume for now that you have a good reason. Here's some code that would do what you want:
#!/usr/bin/perl -w
use strict;
use CGI::Carp qw(fatalsToBrowser);

my $dir = "/tmp/wwwtrash";

opendir DIR, $dir or die "Couldn't open directory $dir: $!";
# Keep only plain files last modified more than one day ago
my @files = grep { -f "$dir/$_" && -M "$dir/$_" > 1 } readdir DIR;
closedir DIR;

# readdir returns bare filenames, so prepend the directory before unlinking
unlink map { "$dir/$_" } @files
    or die "Couldn't unlink files in $dir: $!" if @files;

# rest of your script
Update:
As lindex pointed out, dying is bad CGI manners, since a bare die tends to put nothing useful in the logs or the user's browser. Add use CGI::Carp qw(fatalsToBrowser); after the strict pragma (done above) and everything should be taken care of.