"close" failing

by munishdev (Initiate)
on May 31, 2006 at 12:12 UTC ( [id://552764] )

munishdev has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

My script is something like this:

open (SNMP1, "abc.sh |") or die "Cannot run abc.sh $!";
while (<SNMP1>) {
    # do some processing
}
close (SNMP1) or die "ERROR: testing $! $?\n";

When I execute this from the command prompt, it runs fine. But when I run it from the web browser, "close (SNMP1)" is failing. Since it is run from the web, I don't get to see any error message, but the statements after "close (SNMP1)" never get executed.

Is it advisable to use "die", or can we avoid it? If I don't use "die" on "close (SNMP1)", it works fine.

Thanks a lot in advance.

-munish-

Edited by planetscape - added code tags and rudimentary formatting


Replies are listed 'Best First'.
Re: "close" failing
by derby (Abbot) on May 31, 2006 at 12:32 UTC

    Since it is run from web, I don't get to see any error message

    Then you either need to check your web server's error log or use CGI::Carp.

    -derby

      Specifically, use the following:

      use CGI::Carp qw(fatalsToBrowser warningsToBrowser);
      use CGI qw(:standard);

      print header;
      warningsToBrowser(1);

      This should give you some visibility into the cause of the problem.


      No good deed goes unpunished. -- (attributed to) Oscar Wilde
        Also, the interesting error messages are probably being printed from abc.sh. They should end up in the server's error log. You can use IPC::Open3 to get a handle from which you can read them.
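        For example, a minimal sketch of the IPC::Open3 approach (the path to abc.sh is assumed; substitute the real location on your system):

        use IPC::Open3;
        use Symbol qw(gensym);

        my $err = gensym;                          # open3 needs a real handle for STDERR
        my $pid = open3(my $in, my $out, $err, '/path/to/abc.sh');
        close $in;                                 # nothing is sent to the child

        while (<$out>) {
            # do some processing, as in the original script
        }
        my @errors = <$err>;                       # whatever abc.sh wrote to STDERR

        waitpid($pid, 0);                          # reap the child; its status lands in $?
        warn "abc.sh said: @errors" if @errors;

        (Reading all of STDOUT before STDERR can block if the child writes a lot to STDERR; for anything heavier, interleave the reads or use a higher-level module such as IPC::Run.)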
Re: "close" failing
by gellyfish (Monsignor) on May 31, 2006 at 12:18 UTC

    You probably do want to check the success of the close when you are doing a piped open like that. The likelihood is that $ENV{PATH} is different for the web server process: you should either specify the full path to the program you are running or ensure that the location of the program is in $ENV{PATH} before opening the pipe. The reason you aren't seeing any error message is that the output of die goes to STDERR, which usually ends up in the error log rather than in the browser.
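    For instance, a minimal sketch of the two fixes suggested above (the locations shown are assumptions; substitute the real path to abc.sh on your system):

    $ENV{PATH} = '/usr/local/bin:/usr/bin:/bin';   # either make sure the PATH is sane...
    open (SNMP1, "/usr/local/bin/abc.sh |")        # ...or (better) give the full path
        or die "Cannot run abc.sh: $!";
    while (<SNMP1>) {
        # do some processing
    }
    close (SNMP1) or die "ERROR: closing pipe: $! (exit status $?)";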

    /J\

      I don't think that $ENV{PATH} has anything to do with the failure of the close statement. It would affect the success of open, but not close. Or am I missing something?

      Remember: There's always one more bug.

        The return value of a piped open tells you if the fork succeeded or not, the return value of close tells you about the execution status of the program in the pipe. So, if $ENV{PATH} is set such that perl can't find the program to be piped to/from, close will return a false value (perl was able to successfully fork, but not execute the piped command).
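        A minimal sketch of that behaviour (the command name below is made up so that the exec fails):

        open(my $fh, "no_such_command_xyz |") or die "open failed: $!";
        my @output = <$fh>;          # nothing to read: the exec failed in the child
        unless (close($fh)) {
            # close is false: the fork succeeded, but the command could not be run
            print "close failed, child status was $?\n";
        }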

        Thanks a lot for the replies. You are right here: the print statements worked fine after the open call (in the "do some processing" section), so it cannot be that open failed.
Re: "close" failing
by jdhedden (Deacon) on May 31, 2006 at 16:00 UTC
    The close command returns the exit status of the piped command.

    If the piped command has exited with status=0 before close is called, then close will succeed.

    If it exits with a nonzero status before close is called, then close will fail and $? will hold that exit status (available as $? >> 8).

    If close is called before the piped command exits, then close will fail and $? will show that the command was terminated by SIGPIPE. Why? Because when Perl closes its end of the pipe, the piped command receives SIGPIPE the next time it writes; Perl then waits for it to exit and grabs its status, which records the signal that killed it (i.e., SIGPIPE). (Note that it is possible for the piped command to intercept the SIGPIPE signal and exit with some other status (e.g., exit 0), which will be picked up instead.)

    In the case you experienced with the web server, it may be that the CGI is getting to the close command before the piped command actually exits. If you insist on testing the status of the close command, then modify your piped command to intercept SIGPIPE and terminate with a 0 exit status.
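    For example, a minimal sketch of inspecting $? after the close, assuming the same "abc.sh |" open as in the original script:

    use POSIX qw(SIGPIPE);

    unless (close(SNMP1)) {
        my $signal = $? & 127;     # low bits: signal that killed the child, if any
        my $status = $? >> 8;      # high byte: the child's exit status
        if ($signal == SIGPIPE) {
            warn "abc.sh was terminated by SIGPIPE\n";
        }
        elsif ($signal) {
            warn "abc.sh was killed by signal $signal\n";
        }
        else {
            warn "abc.sh exited with status $status\n";
        }
    }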


    Remember: There's always one more bug.
Re: "close" failing
by Herkum (Parson) on May 31, 2006 at 12:55 UTC

    When you are opening, printing to, or closing a file, you should always die on failure. Things can be going on with a file besides your writing to it (for example, someone deletes it while you still have it open). Dying tells you what Perl thinks went wrong and gives you a hint that there may be something there that you need to look into.
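    For instance, a minimal sketch of checking all three operations with informative messages (the filename is hypothetical):

    my $file = '/tmp/example.out';    # hypothetical path
    open(my $fh, '>', $file)    or die "Cannot open $file for writing: $!";
    print {$fh} "some output\n" or die "Cannot write to $file: $!";
    close($fh)                  or die "Cannot close $file: $!";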

      You don't always have to die; $! will tell you the error.
      Also, in a web app it would be nice to inform the user that something is going wrong, instead of printing 'Internal Server Error' or something similar.
        Another strange thing I found is that if I add ">" in front of abc.sh, it works for me. Can you tell me what ">" does to my script?

        open (SNMP1, ">abc.sh |") or die "Cannot run abc.sh $!";
        while (<SNMP1>) {
            # do some processing
        }
        close (SNMP1) or die "ERROR: testing $! $?\n";
Re: "close" failing
by ahmad (Hermit) on May 31, 2006 at 15:14 UTC

    Hello,

    If you want to check for errors, you could use one of the following approaches:

    # output fatal errors to the browser
    use CGI::Carp qw(fatalsToBrowser);

    # or just change your die statement to a print, like this:
    open(FH, "$filename |") or print "can't open file $!";