http://qs321.pair.com?node_id=915997

tedv has asked for the wisdom of the Perl Monks concerning the following question:

I'm writing a script that needs to process two XML files in parallel: take element #1 from file A and element #1 from file B, output a new element into file C, then do the same for element #2, and so on. As a very simple example:


Input file A:
<doc> <elem>A</elem> <elem>B</elem> <elem>C</elem> </doc>
Input file B:
<doc> <elem>1</elem> <elem>5</elem> <elem>10</elem> </doc>
Output file C:
<doc> <elem>A</elem> <elem>BBBBB</elem> <elem>CCCCCCCCCC</elem> </doc>

The catch is that the files are very large, so you cannot parse them both into memory at once. And sadly the XML::Parser interface seems to require parsing the entire first file, handling all of its callbacks, before you can even start parsing the second file.
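That said, the XML::Parser docs do describe a non-blocking interface: parse_start returns an XML::Parser::ExpatNB object that you feed a chunk at a time with parse_more. If I'm reading that right, something like the sketch below could interleave the two parses (assuming the <doc>/<elem> structure from the example, and the same data_transform as in the text-file version further down). I haven't convinced myself this is the cleanest way, so treat it as a sketch:

use strict;
use warnings;
use XML::Parser;

my ($file_a, $file_b) = @ARGV;

# One queue of completed <elem> texts per file, filled by the handlers.
my (@queue_a, @queue_b);

sub make_parser {
    my ($queue) = @_;
    my $text;    # undef outside an <elem>, accumulates text inside one
    my $parser = XML::Parser->new(Handlers => {
        Start => sub { $text = '' if $_[1] eq 'elem' },
        Char  => sub { $text .= $_[1] if defined $text },
        End   => sub {
            if ($_[1] eq 'elem') { push @$queue, $text; $text = undef }
        },
    });
    return $parser->parse_start;    # an XML::Parser::ExpatNB object
}

my $nb_a = make_parser(\@queue_a);
my $nb_b = make_parser(\@queue_b);

open my $fh_a, '<', $file_a or die "Unable to open $file_a: $!\n";
open my $fh_b, '<', $file_b or die "Unable to open $file_b: $!\n";

while (1) {
    # Feed each parser one chunk, then drain the queues in lockstep.
    my $got_a = read $fh_a, my $buf_a, 4096;
    my $got_b = read $fh_b, my $buf_b, 4096;
    $nb_a->parse_more($buf_a) if $got_a;
    $nb_b->parse_more($buf_b) if $got_b;

    while (@queue_a && @queue_b) {
        print data_transform(shift @queue_a, shift @queue_b);
    }
    last unless $got_a || $got_b;
}
# Like the text-file loop below, leftover elements in the longer
# file are silently ignored.
$nb_a->parse_done;
$nb_b->parse_done;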

Now if these were just simple text files, the code would be pretty easy. It would look something like this:

# Open both input files
open A, "<$file_a" or die "Unable to open $file_a: $!\n";
open B, "<$file_b" or die "Unable to open $file_b: $!\n";

# Process the files in parallel
while (1) {
    # Read the lines
    my $a = <A>;
    my $b = <B>;

    # Good coders would check and warn if one entry was defined and the other
    # was not, but this is just an example, so you should be happy you even
    # get comments.
    last if !defined $a || !defined $b;

    # Process the output
    print data_transform($a, $b);
}
close A;
close B;
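For reference, here is a hypothetical data_transform matching the sample files above (the rule, as I read the sample output, is to repeat file A's letter as many times as file B's number says):

# Hypothetical transform implied by the sample files.
sub data_transform {
    my ($letter, $count) = @_;
    chomp($letter, $count);
    return ($letter x $count) . "\n";    # e.g. "B" x 5 is "BBBBB"
}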

But because it's XML, everything is more painful. Does anyone know what might work? Someone suggested XML::Twig, but I'm still reading the documentation to make sure its internal implementation doesn't prohibit this from working.
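For reference, this is the XML::Twig pattern I've been looking at. A handler plus purge keeps memory bounded for a single file, but as far as I can tell parsefile still blocks until that file is completely parsed, which is exactly my problem (a minimal sketch, assuming the <elem> structure above):

use strict;
use warnings;
use XML::Twig;

my ($file_a) = @ARGV;

my $twig = XML::Twig->new(
    twig_handlers => {
        elem => sub {
            my ($t, $elt) = @_;
            print $elt->text, "\n";    # handle one <elem> at a time
            $t->purge;                 # discard everything parsed so far
        },
    },
);

# Memory stays bounded, but this call does not return until the whole
# file has been parsed, so there is no obvious way to pull elements
# from a second file in between handler calls.
$twig->parsefile($file_a);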


-Ted