Totally great technique. Except that if we can't get
vroom to do it here, all the real perlmonks.org links will still clutter any search engine result listing. :)
How hard would it be to simply redirect user agents that don't look like spiders or bots to the real site? I notice in my logs that the well-behaved spiders ask for robots.txt first, so based on that you could allow only those agents that have asked for robots.txt to access these clean, better-for-indexing pages.
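Just to sketch what I mean (all names here are made up, and a real version would persist this in a DBM file rather than a plain hash): remember which IPs have fetched robots.txt, and only treat those as spiders.

```perl
#!/usr/bin/perl
# Hypothetical sketch: remember which client IPs asked for robots.txt,
# and serve the clean index-friendly pages only to those.
use strict;
use warnings;

my %asked_for_robots;    # IP => timestamp; use a tied DBM file in practice

# call this when a request for /robots.txt comes in
sub note_robots_request {
    my ($ip) = @_;
    $asked_for_robots{$ip} = time;
}

# call this on every other request to decide spider vs. human
sub looks_like_spider {
    my ($ip) = @_;
    return exists $asked_for_robots{$ip};
}
```

Anyone who never asked for robots.txt gets a redirect to the real perlmonks.org page instead.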
Just out of curiosity... does your CGI script simply use an LWP request to perlmonks.org to create the content on the fly? I assume it must, since doing otherwise would mean a constant update job. I also assume that, for efficiency, it caches any page it has processed at least once. Something like this would also be a great start on that CD-ROM version of the site. ;)
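I'm picturing something like the sketch below (purely a guess at how your script might work; the cache path, URL scheme, and `fetch_page` name are all my invention, and the `$fetcher` argument is just there so the cache logic can be exercised without hitting the live site):

```perl
#!/usr/bin/perl
# Hypothetical sketch: pull a node from perlmonks.org with LWP on the
# first request, then serve the cached copy from disk on later hits.
use strict;
use warnings;
use LWP::Simple ();
use File::Spec;

my $cache_dir = '/tmp/pm_cache';    # assumed cache location

sub fetch_page {
    my ($node_id, $fetcher) = @_;
    # default fetcher does the real LWP request; tests can pass a stub
    $fetcher ||= sub {
        LWP::Simple::get("http://perlmonks.org/index.pl?node_id=$_[0]");
    };
    my $file = File::Spec->catfile($cache_dir, "$node_id.html");
    if (-e $file) {                  # cache hit: no trip to the real site
        open my $fh, '<', $file or die "read $file: $!";
        local $/;                    # slurp mode
        return scalar <$fh>;
    }
    mkdir $cache_dir unless -d $cache_dir;
    my $html = $fetcher->($node_id) or die "fetch of node $node_id failed";
    open my $fh, '>', $file or die "write $file: $!";
    print $fh $html;                 # cache miss: store for next time
    return $html;
}
```

The same cache directory, tarred up, would be most of the way to that CD-ROM.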