I've heard of hosting providers that limit CGI resources, especially on cheap plans, but I couldn't find a statement more specific than this one:
We reserve the right to deactivate scripts that cause disruption to our servers.
However, the symptoms you have described indicate that some kind of limitation is in place.
Perhaps you can include this snippet in your CGI script and see if it returns something useful in the HTML source?
$|=1;                                                        #-- unbuffer STDOUT
print "<!-- LIMIT DEBUG: \n", qx{ulimit -a 2>&1}, "\n-->\n"; #-- show process limitations (unix)
$SIG{PIPE} = sub { print "<!-- SIGPIPE! -->\n"; };           #-- could be caused by scraping
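If you want to convince yourself that a handler in %SIG really intercepts SIGPIPE, here is a hypothetical standalone demo (not part of the CGI script): it writes to a pipe whose reader has already exited, which makes the kernel deliver SIGPIPE to the writer. The choice of `true` as the reader is just for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;

my $got_pipe = 0;
$SIG{PIPE} = sub { $got_pipe = 1 };

open my $wr, '|-', 'true' or die "cannot fork: $!";  # reader exits immediately
$wr->autoflush(1);                                   # write now, not at close()
sleep 1;                                             # give 'true' time to exit
print {$wr} "x\n";                                   # kernel raises SIGPIPE here
close $wr;
print $got_pipe ? "SIGPIPE caught\n" : "no SIGPIPE\n";
```

In a CGI context the "reader going away" is typically the client closing the connection mid-response, which is why a scraper can trigger it.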
Another option: use CGI::Carp 'fatalsToBrowser';
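For completeness, a minimal sketch of what that looks like in a script (CGI::Carp ships with core Perl; the HTML body is just a placeholder):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# fatalsToBrowser rewrites die()/compile errors into an HTML error page,
# so the message shows up in the browser even when you cannot reach the
# server's error log.
use CGI::Carp qw(fatalsToBrowser);

my $response = "Content-type: text/html\n\n<html><body>ok</body></html>\n";
print $response;
# Any die() from here on is rendered in the browser, e.g.:
# die "resource limit hit?";
```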
Otherwise, ask support.
Update: Following my own advice, I noticed that ulimit didn't work as advertised (returned nothing) :(
Finally, this one worked with my setup (OpenShift); YMMV:
print "<!-- ", qx{cat /proc/$$/limits 2>&1}, " -->\n";
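If you want more than a raw dump in an HTML comment, that file parses easily too. A sketch, assuming Linux (`/proc` is Linux-specific) and the stock column layout, where fields are separated by runs of two or more spaces:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Parse /proc/$$/limits into a hash of { soft, hard } pairs
# ($$ is the current process's PID).
my %limit;
open my $fh, '<', "/proc/$$/limits" or die "cannot read limits: $!";
while (my $line = <$fh>) {
    next if $line =~ /^Limit/;    # skip the header row
    # Columns: name, soft limit, hard limit, units.
    if ($line =~ /^(\S.*?)\s{2,}(\S+)\s+(\S+)/) {
        $limit{$1} = { soft => $2, hard => $3 };
    }
}
close $fh;
print "soft open-file limit: $limit{'Max open files'}{soft}\n";
```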