I recently noticed that my Perl CGI script, which reads an XML file and renders some of its contents to the browser as HTML, was not correctly rendering some foreign-language characters. While troubleshooting the issue, I created a simple test page, but it appears to have the same problem: foreign characters such as the e accent aigu do not appear correctly in the browser.
#!/usr/bin/perl -w
#use strict;
use CGI::Carp "fatalsToBrowser";
use CGI;
use POSIX qw(ceil floor);
use Cwd;
use Template;
use File::Basename;
# Write output immediately
$|=1;
my $query = CGI->new;
use XML::XSLT;
# The HTTP header must be followed by a blank line before the body
print <<HTML;
Content-Type: text/html; charset=utf-8

<html>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
HTML
use Encode 'is_utf8';
my $u_temp = "Temperature:350\x{00B0}F html:&deg;";
#print 'Is string utf8?[' . utf8::is_utf8($u_temp) . ']<BR>';
print is_utf8($u_temp) ? 1 : 0, "<BR>";
print $u_temp . '<BR>';
my $price_label = "Price:\x{20AC}9.99";
print $price_label . '<BR>';
my $smiley = "Smiley:\x{263a}";
print $smiley . '<BR>';
my $french_word = "Some French Word: Saut\x{00E9} html:&eacute;";
print $french_word . '<BR>';
print '<HR>';
In my browser, the euro symbol, the smiley, and all of the HTML-entity characters render correctly, but the Unicode degree sign and the accented e do not. is_utf8() returns 0 for those strings.
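To isolate the is_utf8() result, here is a minimal standalone sketch (my own reduction, separate from the script above). As I understand it, is_utf8() reports Perl's internal storage, not whether the text is "Unicode": code points at or below 0xFF can be kept in the one-byte internal representation, so the flag stays off for them.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Encode qw(is_utf8);

# \x{00E9} fits in one byte, so Perl can store it without the
# internal UTF-8 flag; \x{20AC} cannot, so its string is flagged.
my $e_acute = "\x{00E9}";
my $euro    = "\x{20AC}";

print "e acute flagged: ", is_utf8($e_acute) ? 1 : 0, "\n";
print "euro flagged:    ", is_utf8($euro)    ? 1 : 0, "\n";
```

That matches what I see in my test page: the strings that fail to render are exactly the ones for which is_utf8() is 0.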
I first suspected the character encoding of the HTML, but my browser does appear to recognize the page encoding as UTF-8.
Any insight as to whether my test should work, or what else I should be looking at to successfully read a Unicode XML file with the XML DOM and render it to the browser, would be appreciated.
I believe everything was working fine until my hosting provider recently upgraded Perl to version 5.8.
Thanks in advance.