I wrote the following benchmark to prove to myself that my
JSON::XS installation was working well enough to replace my homebrew config-file handling code. This is on a quad-core 3 GHz Xeon server running Fedora 7. The config format I use is very simple:
key: value
key:: multiline
value
(blank line)
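For instance, a file in this format (with made-up keys, just to illustrate) might look like:

```
host: example.local
motd:: Welcome to the
server, enjoy your stay

timeout: 30
```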
JSON is obviously more involved, but it can instantiate key values as hashes or arrays directly, whereas with my format I have to subsequently split strings to build a hash or array, which happens a lot in my production codebase. Here's the benchmark script:
#!/usr/bin/perl
# jsonbench - which is faster, load_config or JSON::XS
use strict;
use warnings;
use vars qw/%cfg $fileConfig $fileJson/;
use Benchmark qw/:all/;
use Fcntl qw/LOCK_EX LOCK_UN/;
use JSON;
print JSON->backend->is_xs ? 'JSON::XS' : 'JSON::PP',"\n";
$fileConfig='local.dat';
$fileJson='local.json';
unless (-r $fileConfig) { # create original config file
for my $n1 ('a'..'z') {
for my $n2 ('a'..'z') {
for my $n3 ('a'..'z') {
$cfg{"$n1$n2$n3"}="Test 1 2 3";
}
}
}
save_config();
}
unless (-r $fileJson) { # create json file from fileConfig
load_config();
my $json=JSON->new->encode(\%cfg);
die "No data" unless $json;
open(FILE,'>',$fileJson) || die "$fileJson: $!";
print FILE $json;
close FILE;
}
cmpthese( 1000, {
'load_json' => \&load_json,
'load_config' => \&load_config
});
exit;
sub load_json {
%cfg=();
open(FILE,$fileJson) || die("Can't open $fileJson: $!");
local $/; # slurp the whole file without clobbering $/ globally
my $json=<FILE>;
close FILE;
%cfg=%{JSON->new->decode($json)};
}
sub load_config {
%cfg=();
open(CFGFILE,$fileConfig) || die("Can't open $fileConfig: $!");
flock(CFGFILE,LOCK_EX);
my $key;
while (<CFGFILE>) {
chomp; # chomp, not chop, so the last char of an unterminated line survives
if (!$_ and $key) { # end of multiline definition
$key='';
}
elsif (/^([^:]+)::\s+(.*)$/) { # multiline definition
$key=$1;
$cfg{$key}=$2;
}
elsif ($key) {
$cfg{$key}.="\n$_";
}
elsif (/^([^:]+):\s+(.*)$/) { # single-line definition
$cfg{$1}=$2;
}
}
flock(CFGFILE,LOCK_UN);
close CFGFILE;
}
sub save_config {
umask 0111;
open(CFGFILE,'>',$fileConfig) || die("Couldn't write $fileConfig: $!");
flock(CFGFILE,LOCK_EX);
print CFGFILE "# $fileConfig, last modified ".scalar(localtime)."\n";
for my $key (keys %cfg) {
if (index($cfg{$key},"\n")>-1) {
print CFGFILE "$key\:: $cfg{$key}\n\n";
} else {
print CFGFILE "$key: $cfg{$key}\n";
}
}
flock(CFGFILE,LOCK_UN);
close CFGFILE;
}
The results of 1000 runs:
JSON::XS
Rate load_config load_json
load_config 13.7/s -- -37%
load_json 21.8/s 59% --
So, fellow Monks, why is load_json so unimpressively superior? Shouldn't it be freakin' blazing compared to my lame pure-Perl code? Note that JSON::XS is definitely loaded and being used. The generated file is only slightly bigger for JSON:
-rw-rw-rw- 1 sflitman sflitman 281268 2010-05-30 16:09 local.dat
-rw-rw-rw- 1 sflitman sflitman 333945 2010-05-30 16:09 local.json
As always, all comments are welcome; I'm probably doing something stupid. Also, can anyone tell me how to load an existing hash with JSON rather than using the hash assignment in load_json above?
SSF