in reply to Re: Errors uncaught by CGI::Carp
in thread Errors uncaught by CGI::Carp

There are many ways for a script not to return valid HTTP headers in an intermittent way. The most obvious one is an intermittent restriction of some O/S level resource: open files, available RAM, etc.

Good point as usual hippo.
However, the examples you give would cause the other sites hosted on the same machine, including the test site for this domain, to give random errors. They don't, or at least I have not experienced any. It is possible that they do, since it is an intermittent problem, but it seems unlikely that I would notice it on only one site and no others.
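As an aside, the kind of O/S-level limit hippo describes is easy to see directly. This is a hypothetical sketch (not from the thread): it opens handles on its own source file until the OS refuses, which is exactly the wall a CGI process can hit mid-request and die with a 500.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Keep opening handles on this script's own file until the OS says no.
# Under a typical `ulimit -n 1024` this stops after roughly a thousand
# opens; the 10_000 cap is just a safety stop for the demo.
my @handles;
while (@handles < 10_000) {
    open my $fh, '<', $0 or last;   # $0 = path to this script
    push @handles, $fh;
}
printf "Opened %d handles before a limit (or the cap) stopped us\n",
    scalar @handles;
```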

Are you still running on shared hosting? If so, the real error log is unlikely to be accessible to you...

Yes, it is on shared hosting with cPanel.

According to the cPanel documentation, the error log I can see comes from /usr/local/apache/logs/error_log.

Re^3: Errors uncaught by CGI::Carp
by hippo (Archbishop) on Oct 14, 2021 at 21:42 UTC

    Not all resource limits are global. At $WORK we impose per-user limits (effectively per-site limits) on our shared hosting customers precisely so that one poorly-written or heavily-hit site doesn't adversely impact the rest. I would expect every other shared hoster to implement something similar. This would explain why you only see it in production - that site will (hopefully) receive orders of magnitude more traffic than your test/demo site.


    🦛

      I am sure there are per-user resource limits. In fact, I know there are because the host documents them. But I'd assumed they would cover all my sites within the same account. I have a hosting account with add-on domains; both the production and test sites are add-on domains, and there are others. One of the others gets much more traffic than the one that is generating errors - sometimes up to 50 times more when we have ad campaigns running - and it doesn't give errors.

      I've checked the bandwidth logs and we never go above 15% of the allowance.

      Not all resource limits are global. At $WORK we impose per-user limits

      At $WORK, do you throw a 500 error if a user exceeds their resources, or do you send something more informative to the browser?

        Depends which resource is being exceeded. If they try to exceed their RAM or CPU limits, the process will simply die because there is no other action that could be taken, and that will result in a 500 - although what reaches the browser depends on what has already been sent by that stage. Exceeding other limits mostly just results in that action failing: e.g. if they try to open a file beyond the open-files limit, the open will fail, so their code should test for that (as it always should anyway).
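        A sketch of that "test for it" advice (my own illustration, with a hypothetical path - not code from the thread): build the whole CGI response up front so a failed open() becomes a clean status line rather than a die() mid-response and a bare 500.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $file = '/tmp/does-not-exist.example';   # hypothetical data file

my $response;
if (open my $fh, '<', $file) {
    local $/;                               # slurp mode
    $response = "Content-type: text/plain\r\n\r\n" . <$fh>;
    close $fh;
}
else {
    # open() failed: missing file, or a per-user open-files limit
    # was hit. Report it rather than letting the script die.
    $response = "Status: 503 Service Unavailable\r\n"
              . "Content-type: text/plain\r\n\r\n"
              . "Could not open data file: $!\n";
}
print $response;
```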


        🦛