http://qs321.pair.com?node_id=11141789

fireblood has asked for the wisdom of the Perl Monks concerning the following question:

Dear wise ones,

I have a "bad file descriptor" problem in one of my Perl programs that I’ve been unable to resolve. I've looked at the other topics on perlmonks and other fora with bad file descriptor in their descriptions but didn't see enough similarities to my problem to resolve it.

My program has the following initial instructions in it:

unless (open ($fh_debug, ">>", $self -> {debug_file_name}))
{
    print STDERR "\nCould not open \$fh_debug: $!\n";
    die;
}

The open works just fine, and throughout the program I am able to write many messages to the file referenced by $fh_debug.

At the end of the program, I have the following instructions:

line_no
480     print $fh_debug "jj\n";
481
482     print $fh_debug "\$fh_debug is defined\n" if (defined ($fh_debug));
483
484     if (close ($fh_debug))
485
486     {
487         <do some stuff>;
488     }
489
490     else
491
492     {
493         print STDERR "\nCould not close \$fh_debug: $!\n";
494         die;
495     }

When I invoke the module in which this code resides from Strawberry Perl v5.32.1, it works exactly as expected. The last lines in the debug file are

jj
$fh_debug is defined

and the program finishes nicely and cleanly. But when I invoke the module from Linux Perl v5.34.0, I again get those same two lines at the end of the debug file, and then get the following message:

Could not close $fh_debug: Bad file descriptor
Died at /u1/stat/global/bin/perllib/master_log.pm line 494.
Can't use an undefined value as a symbol reference at /u1/stat/global/bin/perllib/master_log.pm line 480.

What could be causing this? Why would it work perfectly under Strawberry Perl but seem to write its final messages correctly and then paradoxically issue those error messages under (Red Hat 7) Linux Perl?

Thank you.

P.S. How can I prevent the lines between my < code > and < /code > tags from wrapping prematurely?

Replies are listed 'Best First'.
Re: Bad file descriptor when trying to close file handle
by tybalt89 (Monsignor) on Mar 03, 2022 at 17:01 UTC

    You are checking the wrong thing for defined().

    #!/usr/bin/perl
    use strict; # https://perlmonks.org/?node_id=11141789
    use warnings;

    my $filename = 'tmp.11141789';

    open my $fh, '>>', $filename or die "$! opening $filename";
    printf "1) fileno = %s\n", $fh->fileno // 'undefined';
    close $fh or die "$! closing $filename";
    printf "2) fileno = %s\n", $fh->fileno // 'undefined';
    close $fh or die "$! closing $filename";
    printf "3) fileno = %s\n", $fh->fileno // 'undefined';
    close $fh or die "$! closing $filename";

    Outputs:

    1) fileno = 3
    2) fileno = undefined
    Bad file descriptor closing tmp.11141789 at ./pm11141789.pl line 12.

    This was run on ArchLinux; I don't have a Strawberry Perl system to test on.
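
    For reference, here is a minimal sketch (using a hypothetical tmp.sketch file) of checking the handle's open state rather than its definedness: a closed lexical handle is still defined, but fileno (or IO::Handle's opened method) reports whether it is still attached to a file descriptor.

        #!/usr/bin/perl
        use strict;
        use warnings;
        use IO::Handle;   # for the opened() method

        my $filename = 'tmp.sketch';
        open my $fh, '>>', $filename or die "$! opening $filename";

        # Close only if the handle is still attached to a descriptor.
        if (defined fileno $fh) {
            close $fh or die "$! closing $filename";
        }

        print "still defined\n" if defined $fh;            # true even after close
        print "still open\n"    if defined fileno $fh;     # false after close
        printf "opened() says: %s\n", $fh->opened ? 'open' : 'closed';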

Re: Bad file descriptor when trying to close file handle
by Discipulus (Canon) on Mar 03, 2022 at 11:32 UTC
    Hello fireblood,

    Without something small to reproduce your error I can merely shoot in the dark... but I'm lucky at this sometimes :)

    Is the filehandle select-ed somewhere? Is fileno returning something meaningful just before the call to close? Does use diagnostics tell you anything more?

    It seems not to be the case, but: is your handle a piped open?

    From the open docs:

    > If the filehandle came from a piped open, close returns false if one of the other syscalls involved fails or if its program exits with non-zero status.

    Is there something weird in the environment of the Linux box? Are there threads involved?

    Inspecting $^E can also help at the OS level.
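
    A minimal sketch of that quoted piped-open behaviour, with a hypothetical child command: close returns false when the child exits non-zero, and the exit status lands in $? rather than $!.

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Hypothetical child: prints a line, then exits with status 3.
        open my $pipe, '-|', $^X, '-e', 'print "hello\n"; exit 3'
            or die "Could not start child: $!";
        print while <$pipe>;

        unless (close $pipe) {
            # For a non-zero child exit, $! is forced to 0 and $? holds the status.
            warn "close failed: child exit status ", $? >> 8, "\n";
        }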

    L*

    There are no rules, there are no thumbs..
    Reinvent the wheel, then learn The Wheel; may be one day you reinvent one of THE WHEELS.
Re: Bad file descriptor when trying to close file handle
by choroba (Cardinal) on Mar 03, 2022 at 11:37 UTC
    Are you using threads or fork?

    map{substr$_->[0],$_->[1]||0,1}[\*||{},3],[[]],[ref qr-1,-,-1],[{}],[sub{}^*ARGV,3]
Re: Bad file descriptor when trying to close file handle
by ikegami (Patriarch) on Mar 03, 2022 at 14:35 UTC

    Notice that after you die, you try to execute that block of code an additional time. ("Can't use an undefined value ... at ... line 480.") That means you try to close the file handle more than once.

    We know you try to close the file handle twice. Or rather, at least twice. I suspect there are actually three attempts. Specifically, I suspect you closed it once before the attempt that died, and that the first attempt was successful.

      Hi, I read your post several times, focusing on your comment "Notice that after you die, you try to execute that block of code an additional time." I could not see what you were referring to -- after I die, there are no more instructions. And "die" terminates the execution of the program anyway, so I'm not clear how, after dying, I could possibly try to execute that block of code an additional time. When the die instruction is in fact executed (when the invoker calls this module a second time), the program totally stops running. Where would I notice that after I die I try to execute that block of code an additional time? I'm drawing a blank on it. Thanks much.

        I could not see what you were referring to -- after I die, there are no more instructions.

        You have "Can't use an undefined value as a symbol reference at /u1/stat/global/bin/perllib/master_log.pm line 480." after "Died at /u1/stat/global/bin/perllib/master_log.pm line 494."

        I'm not clear as to how after dying I could possibly try to execute that block of code an additional time.

        Neither are we, since you didn't show us. Maybe you catch the exception? Maybe you call it from a destructor or END block?
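
        A minimal sketch of that last possibility, with a hypothetical tmp.sketch file: an END block still runs after die(), so cleanup code placed there can end up closing an already-closed handle.

            #!/usr/bin/perl
            use strict;
            use warnings;

            open my $fh, '>>', 'tmp.sketch' or die "open failed: $!";

            END {
                # Runs even after the die() below; the handle is already closed by then.
                close $fh or warn "second close failed: $!\n";   # Bad file descriptor
            }

            close $fh or die "first close failed: $!";
            die "some later failure";   # the END block above still runs after this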

Re: Bad file descriptor when trying to close file handle
by LanX (Saint) on Mar 03, 2022 at 11:41 UTC
    my guess is scoping problems, since there is no my $fh_debug

    do you use strict and warnings?
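
    For comparison, a minimal sketch of a lexically scoped handle under strict and warnings (with a hypothetical debug.log): the handle exists only inside the block and is closed implicitly when it goes out of scope, so it cannot be closed again somewhere else by accident.

        use strict;
        use warnings;

        {
            open my $fh_debug, '>>', 'debug.log'
                or die "Could not open debug.log: $!";
            print {$fh_debug} "debug message\n";
            close $fh_debug or die "Could not close debug.log: $!";
        }   # $fh_debug ceases to exist here; a still-open handle would be closed implicitly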

    > P.S. How can I prevent the lines between my < code > and < /code > tags from wrapping prematurely?

    you don't. The wrapping depends on the Display Settings of the users reading it.

    Cheers Rolf
    (addicted to the Perl Programming Language :)
    Wikisyntax for the Monastery

Re: Bad file descriptor when trying to close file handle
by LanX (Saint) on Mar 03, 2022 at 12:06 UTC
    > But when I invoke the module from Linux Perl v5.34.0,

    another guess: are multiple processes simultaneously writing to the file, or even renaming or deleting it?

    "Bad file descriptor" is an OS/FS error, not a Perl one.

    Cheers Rolf
    (addicted to the Perl Programming Language :)
    Wikisyntax for the Monastery

      No, changes to the directory entry that was used to open the file (including renaming and deleting) would not result in that error.

      Unix: Doing these things has no effect on the file, to which one can continue writing. It would not produce an error at all.

      Windows: File handles opened by Perl don't even allow those operations to occur. (This is standard.)
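
      A minimal sketch of the Unix behaviour, with a hypothetical tmp.unlink-demo file: the handle stays valid after the name is unlinked, writes still succeed, and the data goes to the now-anonymous inode, which is discarded once the last handle is closed.

          #!/usr/bin/perl
          use strict;
          use warnings;

          my $name = 'tmp.unlink-demo';
          open my $fh, '>>', $name or die "open: $!";
          unlink $name             or die "unlink: $!";

          # The descriptor still points at the (now nameless) inode; no error here.
          print {$fh} "still writable\n" or die "print: $!";
          close $fh                      or die "close: $!";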

        I tested it out, and you are right: the only way to reproduce this was closing the same FH twice. (At least I think so; Ubuntu is giving me German error messages.)°

        > to which one can continue writing.

        well, kind of: printing to an unlinked file in ">>" mode seems to do nothing, but there's no error message either.

        (Tho I only did a quick test in the debugger...)

        Cheers Rolf
        (addicted to the Perl Programming Language :)
        Wikisyntax for the Monastery

        °) it's unfortunate we can't add these kinds of errors to perldiag

Re: Bad file descriptor when trying to close file handle
by fireblood (Scribe) on Mar 04, 2022 at 20:06 UTC

    Dear wise monks,

    In follow-up to your feedback I updated my code as follows:

    line_no
    480     my $subroutine_name = (caller(0))[3];
    481     print "Arrived in the $subroutine_name subroutine.\n";
    482
    483     print $fh_debug "jj\n";
    484
    485     print $fh_debug "\$fh_debug is defined\n" if (defined ($fh_debug));
    486
    487     if (close ($fh_debug))
    488
    489     {
    490         print "The close function call was successful.\n";
    491         <do some stuff>;
    492     }
    493
    494     else
    495
    496     {
    497         print STDERR "\nCould not close \$fh_debug: $!\n";
    498         die;
    499     }

    The output then became the following:

    Arrived in the custom_functions::master_log subroutine.
    The close function call was successful.
    Arrived in the custom_functions::master_log subroutine.

    Could not close $fh_debug: Bad file descriptor
    Died at /u1/stat/global/bin/perllib/master_log.pm line 498.
    Can't use an undefined value as a symbol reference at /u1/stat/global/bin/perllib/master_log.pm line 483.

    Well, that confirmed what most of you had suspected: somehow I had called the close function more than once. It was not evident in the code of the module. There were no threads or forks involved. The variable $fh_debug had in fact been declared as an "our" variable higher up in the module (not shown), and I was intentionally using the expression "defined ($fh_debug)" as a Boolean to indicate whether or not an optional debugging file had been instantiated in the new() constructor for the module. So those considerations seemed okay. But given your feedback, I modified the code again as follows:

    480     my $subroutine_name = (caller(0))[3];
    481     my $invoker = (caller(1))[3];
    482     print "Arrived in the $subroutine_name subroutine.\n";
    483     print "My invoker is $invoker.\n";

    The output from that second modification confirmed that the subroutine was not being called twice within the same module; it was being called more than once by the invoker. That allowed me to shift my attention from the module code to the invoker code, where I found that indeed the invoker was the guilty one, calling the module to do the close more than once. Once I corrected the invoker code, the problem disappeared.
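
    As an extra safeguard, here is a sketch of one way to make the close idempotent (reusing the package-level our $fh_debug from the module), so that a repeated call from the invoker becomes a no-op instead of an error:

        # Guard the close so a second call is harmless.
        if (defined $fh_debug && defined fileno $fh_debug) {
            close $fh_debug or do {
                print STDERR "\nCould not close \$fh_debug: $!\n";
                die;
            };
            undef $fh_debug;   # the debug log is no longer available
        }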

    I had been studying the module code for days trying to figure out why a call to a simple file close operation could trigger the error messages that I had been experiencing. It wasn’t until posting here and reading your comments that I was able to diagnose what was really going on and resolve the problem.

    Thanks much for your feedback.