http://qs321.pair.com?node_id=741545

mandog has asked for the wisdom of the Perl Monks concerning the following question:

I'm enjoying Test::WWW::Mechanize, with one quirk: it fails on links to pages on Wikipedia.

A link to the main Wikipedia page is OK.

There is no JavaScript in the page.

The links work in Firefox, Konqueror, and wget. Per the WWW::Mechanize FAQ, I've struggled to find the difference between what the browsers send and what Mech sends.

The only slight clue is that wget -vS shows that Wikipedia is behind a Squid caching proxy, but this is also the case for the front page, which works.
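Since the FAQ's advice boils down to comparing headers, a quick way to capture exactly what Mech sends and receives (to line up against the wget -vS output) is LWP's handler hooks. A minimal sketch, assuming LWP 5.815 or later (where add_handler was introduced); in void context the dump calls print each message to STDERR:

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new( autocheck => 0 );

# Print every request and response on the wire, for comparison
# with the headers wget -vS reports.
$mech->add_handler( request_send  => sub { shift->dump; return } );
$mech->add_handler( response_done => sub { shift->dump; return } );

$mech->get('http://en.wikipedia.org/wiki/Affero_General_Public_License');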

To get on with my life, I'm linking to gnu.org instead, but I would still appreciate any help in resolving this mystery.

Below is minimal but complete Perl and HTML to reproduce the problem:

#!/usr/bin/perl
use strict;
use warnings;
use Test::WWW::Mechanize;
use Test::More tests => 2;

my $mech = Test::WWW::Mechanize->new(
    stack_depth => 10,
    timeout     => 60,
);

$mech->get_ok('http://localhost/test.html');

# fails
$mech->page_links_ok();

__DATA__
<!-- http://localhost/test.html -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
  <head>
    <title>test</title>
  </head>
  <body>
    <p>
      <a href="http://en.wikipedia.org/wiki/Affero_General_Public_License">fails</a>
    </p>
    <p>
      <a href="http://thecsl.org">works</a>
    </p>
    <p>
      <a href="http://wikipedia.org/">works</a>
    </p>
    <p>
      <a href="http://en.wikipedia.org/wiki/Mode_Gakuen_Cocoon_Tower">fails</a>
    </p>
    <p>
      <a href="http://www.gnu.org/licenses/agpl.html">?</a>
    </p>
  </body>
</html>
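One difference I haven't ruled out above is the User-Agent header, since a server (or the Squid in front of it) may answer differently depending on it. A minimal sketch to check that; agent_alias and the 'Linux Mozilla' alias are stock WWW::Mechanize, and the URL is one of the failing links:

#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

my $url = 'http://en.wikipedia.org/wiki/Affero_General_Public_License';

# Try with Mech's default User-Agent string first.
my $mech = WWW::Mechanize->new( autocheck => 0 );
$mech->get($url);
print "default agent: ", $mech->response->status_line, "\n";

# Now masquerade as a browser. If only this request succeeds,
# the server is keying on the User-Agent header.
$mech->agent_alias('Linux Mozilla');
$mech->get($url);
print "browser agent: ", $mech->response->status_line, "\n";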