Elliott has asked for the wisdom of the Perl Monks concerning the following question:

We are always told to switch on the warnings flag - and that strikes me as advice worth following... but how?

If, like me, you are running CGI code on a remote server, how do you direct the warnings to print to a web page rather than onto some poor sysadmin's error log (or wherever they disappear to)?

I am learning to get into the habit of putting "Content-type..." into my die strings. Now it's time for the next step...


Replies are listed 'Best First'.
Re: Running with warnings on a remote server
by rob_au (Abbot) on Nov 04, 2001 at 19:02 UTC
    There are a couple of ways in which you could achieve this ...

    • If all you want to do is capture fatal errors and events, you should have a look at CGI::Carp which allows you to redirect fatal error messages to the web client in the following fashion:

      use CGI::Carp qw/fatalsToBrowser/;
      die "Something bad happened";

      Fatal errors will now be echoed to the browser as well as to the HTTPD error log. CGI::Carp sends a minimal HTTP header to the browser so that even errors that occur in the early compile phase will be seen. However, with this method, non-fatal errors will still be directed only to the HTTPD error log (see below).

    • Non-fatal warnings can be trapped via the __WARN__ pseudo-signal, for which custom handler subroutines can be installed. eg.

      $SIG{__WARN__} = sub {
          print STDOUT "Content-Type: text/plain\n\n";
          print STDOUT @_, "\n";
          exit 0;
      };

      This of course could be combined with the CGI::Carp module as follows:

      use CGI::Carp qw/fatalsToBrowser/;
      $SIG{__WARN__} = sub { die @_; };

      The problem with this approach (and with the anonymous subroutine above) is that non-fatal warnings get treated as fatal errors. This will certainly tighten up your code, but it is admittedly quite a zealous approach to the matter. A cleaner method may be to incorporate this trapping into an email reporting function, which sends the script warnings (along with a dump of environment variables and configuration arguments) to you via email rather than fatally to the browser screen. eg.

      use CGI::Carp qw/fatalsToBrowser/;
      use Mail::Mailer;

      # Rough code follows, not to be used in current form
      $SIG{__WARN__} = sub {
          my $smtp = Mail::Mailer->new("smtp", Server => "127.0.0.1");
          $smtp->open({
              'To'   => 'webmaster@your.domain.com',
              'From' => 'nobody@your.domain.com',
          });
          print $smtp @_, "\n";
          $smtp->close;
      };

     

    Ooohhh, Rob no beer function well without!

      CGI::Carp is actually a bit more powerful than that, even. Some examples (some from the docs):
      # Send warn/die messages to a separate log file
      BEGIN {
          use CGI::Carp qw(carpout);
          open(LOG, ">>/usr/local/cgi-logs/mycgi-log")
              or die("Unable to open mycgi-log: $!\n");
          carpout(LOG);
      }

      # Or, normal fatalsToBrowser behavior but catch warnings also
      use CGI::Carp qw(fatalsToBrowser warningsToBrowser);
      use CGI qw(:standard);
      print header();
      warningsToBrowser(1);    # enable warnings to browser
      Nifty stuff!
Re: Running with warnings on a remote server
by joealba (Hermit) on Nov 04, 2001 at 18:22 UTC
    Well, a UNIX logging server would probably be the best solution.

    Next best: I'm sure one of the UberMonks will post some kind of cool Carp overloading scheme that forks off a process to do error logging and waits for an exclusive lock on the file, without holding up your script.
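    For what it's worth, a much simpler, non-forking sketch of that idea is just a __WARN__ handler that appends under an exclusive flock -- the log path below is illustrative:

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Hypothetical sketch: append each warning to a log file under an
# exclusive lock, so concurrent CGI processes don't interleave lines.
$SIG{__WARN__} = sub {
    if (open my $fh, '>>', '/tmp/cgi-warn.log') {
        flock($fh, LOCK_EX);
        print {$fh} scalar(localtime), ': ', @_;
        close $fh;    # closing the handle releases the lock
    }
};

warn "something looks fishy\n";    # lands in the log, not the error log
```

    No forking and no Carp overloading, so a slow disk will still hold up the script briefly -- but for light traffic that is usually good enough.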

    But then there's my little kludge: I just set up an "errors" directory in each of my script directories, owned by the webuser.
    # TOP OF SCRIPT
    use File::Basename;
    my ($bname, $bpath, $bsuffix) = fileparse($0, '.pl', '.cgi');
    my $LOGFILE = "errors/$bname.txt";
    if (-s $LOGFILE > 10000) {
        unlink $LOGFILE;
    }
    open(STDERR, ">>$LOGFILE") or die "Couldn't open script log: $!\n";
    It's not perfect, and race conditions exist. But, it works well enough for my needs -- I'm mainly concerned with just redirecting the errors so I don't get the evil 500, not so much with keeping track of the errors.

    So, it would be easy enough to toss these errors out to a web-accessible directory and accomplish what you want.
Re: Running with warnings on a remote server
by Aristotle (Chancellor) on Nov 05, 2001 at 02:46 UTC
    The suggestions here are all well and good, but IMO the correct way is to develop the script on a local server, taking note of the warnings that appear during regular operation, and then either eliminating their cause (e.g. mentioning variables in void context to get rid of "used only once" warnings) or selectively placing { no warnings 'whichever_warning_type_crops_up'; '...'; } blocks (or, for older perls, { local $^W = 0; '...'; }) around any code you know to cause warnings during regular operation.
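    A minimal sketch of that lexical approach (the undefined variable is contrived just to trigger an 'uninitialized' warning):

```perl
use strict;
use warnings;

my $maybe_undef;    # deliberately never assigned

# Interpolating an undef normally warns "Use of uninitialized value";
# the block below silences only that category, and only for this scope.
{
    no warnings 'uninitialized';
    my $line = "value: $maybe_undef";
    print "$line\n";    # prints "value: "
}

# Pre-5.6 equivalent (dynamic rather than lexical):
# { local $^W = 0; '...'; }
```

    Outside the closing brace, full warnings are back in force -- which is the point: the exception is as narrow as the code that needs it.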

    Bottom line, unless the script dies horribly, running it should never produce warnings. (This is in fact a rule I stick to even outside of CGI scripts - the only time I disregard it is in throw-away scripts written for a single task at hand.)
      It's all very well saying that the correct way to develop the script is on a local server - if you have one. If I had one, I wouldn't have the problem in the first place!

      I hope everyone who turns on the warnings flag does it in order to clean up their code rather than for the entertainment of their users!

      Thanks everyone for the information - excellent and detailed as usual. Praise be! For the Monks are great!

        Never fear, CGI is here! :-) That is, unless you're trying to write stuff that fiddles with things tied to a server (mod_perl handlers for certain request phases, for example). If you're just writing a plain ol' CGI script, its debug mode should be of great help, letting you run CGI scripts as if they were regular ones. Of course, it's more tiresome to debug them this way than clicking refresh in your browser..
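        To illustrate what that debug mode buys you, a script like this (the parameter name is illustrative) can be fed its parameters straight from the shell, no web server required:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q = CGI->new;
print $q->header('text/plain');
print 'Hello, ', ($q->param('name') || 'anonymous'), "\n";
```

        Run it as `perl script.pl name=Elliott` and CGI.pm parses the command-line name=value pairs as if they had arrived in a query string; run it with no arguments at a terminal and it prompts for name=value pairs on standard input (end with ^D).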