Re^3: Perl module to get warn or die to render as HTML

by OfficeLinebacker (Chaplain)
on Sep 26, 2006 at 13:39 UTC [id://574931]


in reply to Re^2: Perl module to get warn or die to render as HTML
in thread Perl module to get warn or die to render as HTML

Great point. At my org we share a single web server error log, and the number of production scripts that 'use warnings' is outrageous.
So also turn off warnings and debugging in general in production, so that other web programmers don't have to sift through 100+ lines of Net::FTP
debug output to find the lines in the error log that came from their own script.
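
One way to keep that kind of chatter out of a shared log is to gate the Net::FTP Debug flag on some kind of switch. A minimal, untested sketch -- MYAPP_DEBUG and ftp.example.com are just placeholders for whatever your shop uses:

    #!/usr/bin/perl
    use strict;
    use Net::FTP;

    # Leave protocol tracing off unless explicitly requested, so the
    # shared production error log stays quiet.
    my $debug = $ENV{MYAPP_DEBUG} ? 1 : 0;

    my $ftp = Net::FTP->new('ftp.example.com', Debug => $debug)
        or die "Cannot connect: $@";
    $ftp->login('anonymous', 'anon@example.com')
        or die "Login failed: " . $ftp->message;
    $ftp->quit;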

T

_________________________________________________________________________________

I like computer programming because it's like Legos for the mind.

Re^4: Perl module to get warn or die to render as HTML
by ptum (Priest) on Sep 26, 2006 at 13:47 UTC

    I'm not sure I really agree with this. While most of my applications are for internal customers (and so I don't worry too much about the point tilly raised), it seems to me that if you have a lot of warning chaff in your logs, the solution is to fix the warnings, not suppress them. I generally leave CGI::Carp directives like 'fatalsToBrowser' in effect for my web apps -- I'd much rather my internal customers get an intelligible error message than something less helpful, if only so that when they e-mail me about it, I can start thinking about the cause of the problem immediately (without sifting through the logs). I think it also helps my customers to see that when errors happen, they're for a reason (and sometimes are because of other dependent systems, and therefore not entirely my fault). :)

    Of course, OfficeLinebacker may have been talking more about suppressing debug stuff, which can be configured nicely with things like Log::Log4perl.
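
    A minimal Log::Log4perl setup along those lines might look like the following untested sketch; MYAPP_ENV is a made-up environment flag for telling development apart from production:

        use strict;
        use Log::Log4perl qw(:easy);

        # Log at DEBUG in development, but only WARN and above in
        # production, so debug chatter never reaches the shared log.
        my $level = ($ENV{MYAPP_ENV} && $ENV{MYAPP_ENV} eq 'dev') ? $DEBUG : $WARN;
        Log::Log4perl->easy_init($level);

        my $logger = get_logger();
        $logger->debug("verbose tracing, visible only in development");
        $logger->warn("something worth keeping in the production log");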

      I don't mind leaving fatalsToBrowser on either, because I, too, program mostly for internal customers. I like your term "warning chaff."
      What irks me is stuff like the following line, repeated with the exact same timestamp, line number, and filehandle name something like 20-50 times:

      [Mon Sep 25 15:33:37 2006] securityQuery.cgi: Use of uninitialized value in concatenation (.) or string at /var/appl/httpd/cgi-bin/ms/security/securityQuery.cgi line 154, <FILE> line 122.

      Sorry, feeling a little grumpy today. Also, I used to be one of the worst offenders.

      I just thought of one last tip -- if you do use CGI::Carp and want to send warnings to the browser, don't forget to put your

      warningsToBrowser(1);
      statement after you've written the headers.
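
      A minimal, untested sketch of that ordering:

          #!/usr/bin/perl
          use strict;
          use CGI;
          use CGI::Carp qw(warningsToBrowser fatalsToBrowser);

          my $q = CGI->new;

          # Headers go out first ...
          print $q->header('text/html');

          # ... and only then is warningsToBrowser switched on, so warnings
          # show up as HTML comments in the page instead of mangling the headers.
          warningsToBrowser(1);

          warn "this ends up as an HTML comment in the page source";
          print $q->start_html('demo'), $q->p('Hello'), $q->end_html;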

      T.

      _________________________________________________________________________________

      I like computer programming because it's like Legos for the mind.

        Right. But in most cases, warnings like the one you cited can and should be corrected in production code -- if you're trying to manipulate a value and it is undefined, then you haven't really trapped for the possibility that the value you're handling is somehow not yet populated, which can lead to more serious problems.
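
        A tiny, self-contained illustration of trapping that case before the concatenation happens (the %dept lookup is just a made-up stand-in):

            use strict;
            use warnings;

            my %dept = ( alice => 'Security' );   # stand-in for a lookup that can come back empty

            for my $user (qw(alice bob)) {
                my $dept = $dept{$user};          # undef for 'bob'

                # Trap the undefined case explicitly instead of letting
                # "Use of uninitialized value in concatenation" hit the error log.
                if (defined $dept) {
                    print "$user: $dept\n";
                }
                else {
                    warn "no department on record for $user\n";
                    print "$user: (not yet assigned)\n";
                }
            }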

Re^4: Perl module to get warn or die to render as HTML
by tilly (Archbishop) on Sep 26, 2006 at 18:29 UTC
    I strongly disagree with this.

    Don't silence the messenger. Leave warnings and debugging on. Actively clean up warnings so that you don't have any spurious warnings. Monitor what remains.

    You'll find that a significant fraction of the time the warnings you get will point to bugs that slipped past development and QA. Often enough to justify the rest of the work.

Re^4: Perl module to get warn or die to render as HTML
by valavanp (Curate) on Sep 26, 2006 at 15:21 UTC
    First you should trace the error log path and see what kind of error is being reported. Then, as suggested by the other monks, you can download the module below from CPAN and use it:

    use strict;
    use CGI::Carp qw(fatalsToBrowser);

    Or, if your script has many more lines and you want to check whether a particular function is being reached and working, you can add the following two lines as a checkpoint and test:

    print "Content-type:text/html\n\n"; print "here";
