http://qs321.pair.com?node_id=413719

Limbic~Region has asked for the wisdom of the Perl Monks concerning the following question:

All,
There is a long story behind this that involves a Java programmer asking for some help with Perl. I won't get into the particulars other than to say the question asked was:

What's the easiest way to loop through a comma delimited file and append the line minus 1 column into a new file that is the same name as the excluded column?

The file in question was about 20 lines long. I gave my disclaimer about normally using a module to handle CSV, but the following code should work:

#!/usr/bin/perl
use strict;
use warnings;

while ( <DATA> ) {
    my @field = split /,/;
    my $file  = splice @field, 2, 1;
    open (OUTPUT, '>>', $file) or die $!;
    print OUTPUT join ',', @field;
}
__DATA__
1,2,foo,3
4,5,bar,6
7,8,foo,9
I asked the Java programmer the next day how it worked and was informed that it was too slow and that a Java program was being written instead. Scratching my head, I asked if the same file I was shown before was the one actually being used. It wasn't - multiple files, each millions of lines long. Opening and closing file(s) that many times is bound to be slow. I offered the following modification* of the code, provided the column being excluded was fairly repetitive in the file:
#!/usr/bin/perl
use strict;
use warnings;

my %fh;
while ( <DATA> ) {
    my @field = split /,/;
    my $file  = splice @field, 2, 1;
    if ( ! $fh{$file} ) {
        open ($fh{$file}, '>>', $file) or die $!;
    }
    print { $fh{$file} } join ',', @field;
}
__DATA__
1,2,foo,3
4,5,bar,6
7,8,foo,9
I explained that the reason for the disclaimer was that the hash only buys performance if a file has more than one line appended to it. Additionally, if there are too many unique files, memory and/or open file descriptors may become a problem. I was then told that the Java code was nearly done, but thanks anyway. *shrug* - exit stage right.
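On the file-descriptor point: since the files are opened in append mode, a handle can be closed and reopened later without losing data, so one standard workaround is to cap the number of cached handles and evict the least-recently-used one. Since the rewrite is apparently happening in Java anyway, here is a hedged sketch of that idea there (class name and the tiny cap are my own inventions, just to illustrate eviction):

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.io.UncheckedIOException;
import java.util.LinkedHashMap;
import java.util.Map;

// A writer cache that never holds more than maxOpen file handles.
// Append mode makes eviction safe: a reopened file picks up where it left off.
public class WriterCache extends LinkedHashMap<String, PrintWriter> {
    private final int maxOpen;

    public WriterCache(int maxOpen) {
        super(16, 0.75f, true); // access-order: the eldest entry is the least recently used
        this.maxOpen = maxOpen;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, PrintWriter> eldest) {
        if (size() > maxOpen) {
            eldest.getValue().close(); // flush and release the descriptor before dropping it
            return true;
        }
        return false;
    }

    // Return a cached writer, opening the file in append mode on a miss.
    public PrintWriter writer(String file) {
        PrintWriter out = get(file); // get() refreshes LRU order in access-order mode
        if (out == null) {
            try {
                out = new PrintWriter(new FileWriter(file, true)); // '>>' equivalent
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
            put(file, out); // put() may evict the eldest entry via removeEldestEntry
        }
        return out;
    }

    public void closeAll() {
        values().forEach(PrintWriter::close);
        clear();
    }
}
```

The trade-off is the same as in the Perl version: eviction costs an extra open per reopened file, so the cap only helps when the number of distinct key values exceeds what the process can keep open at once.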

I think I am missing how Java is going to be that much faster. I assume Java is still going to open and close the file each time through the loop unless there is a similar trick. Given that I don't really know Java, I could be out in left field here.
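As far as I can tell, the same trick is available in Java: keep the writers in a HashMap keyed by the excluded column's value and open each output file in append mode only once. A rough sketch (the sample records and column index are carried over from the Perl version above; nothing here is from the actual Java program, which I never saw):

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SplitByColumn {
    public static void main(String[] args) throws IOException {
        // Same sample records as the Perl __DATA__ section.
        String[] lines = { "1,2,foo,3", "4,5,bar,6", "7,8,foo,9" };
        int keyColumn = 2; // the third field names the output file

        Map<String, PrintWriter> writers = new HashMap<>();
        for (String line : lines) {
            List<String> fields = new ArrayList<>(List.of(line.split(",")));
            String file = fields.remove(keyColumn); // splice out the key column
            // Open each output file once, in append mode, and cache the writer.
            PrintWriter out = writers.get(file);
            if (out == null) {
                out = new PrintWriter(new FileWriter(file, true));
                writers.put(file, out);
            }
            out.println(String.join(",", fields));
        }
        for (PrintWriter out : writers.values()) {
            out.close(); // flush buffered output
        }
    }
}
```

So unless the Java version does something like this, it pays the same open/close cost per line that the first Perl version did.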

Leaving Java aside, is there a more run-time efficient way than my second suggestion in Perl? I haven't given it a lot of thought because the Java developer is just being silly. It is a run-once-and-done script, so it would already be finished if the first version (wrapped in a tiny shell script) had been allowed to run. On the other hand, this is the sort of thing that I like to be aware of in the future. (Prior Planning Prevents Poor Performance)**

Cheers - L~R

* The actual code used ARGV
** I learned this in the military, but there were a couple of extra expletive Ps