
I'm very much enjoying reading Perl Best Practices. I find myself scribbling notes after almost every page; mainly concrete ideas for improving the quality and especially the maintainability of production Perl code at work. To me, this is the most important Perl book to be published for years, because it helps me sell Perl as a maintainable language to management.

However, the In-situ Arguments ("Allow the same filename to be specified for both input and output") practice described on page 304 in chapter 14 has me scratching my head, for it seems to me to be more a "dangerous practice" than a "best practice".

Here is a test program, derived from the example given in the book:

# The idea is to use the Unix unlink trick to write the
# destination file without clobbering the source file
# (in the case where the source and destination are the same file).
use strict;
use warnings;

my $source_file      = 'fred.tmp';
my $destination_file = $source_file;

# Open both filehandles...
use Fatal qw( open );
open my $src,  '<', $source_file;
unlink $destination_file;
open my $dest, '>', $destination_file;

# Read, process, and output data, line-by-line...
while (my $line = <$src>) {
    print {$dest} transform($line);
}

# This is my test version of the transform() function;
# the sleep is there for convenience in testing what happens
# if you interrupt proceedings mid stream by pressing CTRL-C.
sub transform {
    sleep 1;
    return "hello:" . $_[0];
}

My problem with this code is that it is not safely interruptible: as soon as the destination has been unlinked and the new file opened, the original data exists only in the not-yet-finished output. If the program dies or is interrupted part-way through (try pressing CTRL-C during the sleep in transform()), you are left with a truncated file and no way to recover the original.

As discussed in Re-runnably editing a file in place, it seems sounder to first write to a temporary file. Once you're sure the temporary file has been written without error (and after its permissions have been updated to match the original's), you atomically rename the temporary file over the original. That way, if writing the new file is interrupted for any reason, the original is untouched and you can simply re-run the program without losing any data.
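For concreteness, here is a minimal sketch of that temp-file approach. The use of File::Temp, the template name, and the error handling are my own choices for illustration; this is not code from the book or from the node linked above.

use strict;
use warnings;
use File::Temp qw( tempfile );
use Fatal qw( open close );

my $source_file      = 'fred.tmp';
my $destination_file = $source_file;

open my $src, '<', $source_file;

# Write the transformed data to a temporary file in the same
# directory as the destination, so the final rename() does not
# cross a filesystem boundary.
my ($dest, $temp_file) = tempfile( 'fredXXXX', DIR => '.', UNLINK => 0 );

while (my $line = <$src>) {
    print {$dest} transform($line);
}
close $dest;
close $src;

# Match the original file's permission bits on the temporary file...
my $mode = (stat $source_file)[2] & 07777;
chmod $mode, $temp_file
    or die "Cannot chmod $temp_file: $!";

# ...then atomically replace the original. If anything above failed,
# the original file is still intact and the program can simply be re-run.
rename $temp_file, $destination_file
    or die "Cannot rename $temp_file to $destination_file: $!";

# Same test transform() as in the program above.
sub transform {
    sleep 1;
    return "hello:" . $_[0];
}

Because rename() on the same filesystem is atomic, the destination is always either the untouched original or the complete new version, never a partial mixture.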

Please let me know what I've overlooked.