I have a Perl script that uses CGI.pm to take some input, do some (fairly heavy) calculation, and return a few lines of calculated answers.
Everything works fine on my local Apache web servers, but after uploading the script to my ISP to make it available on the net, the process dies on long calculations.
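One way to narrow down *how* the process is dying is to install signal handlers and log whatever arrives. A hard CPU limit typically shows up as SIGXCPU, a timeout kill as SIGTERM (SIGKILL cannot be caught), and a dropped client connection as SIGPIPE. This is only a sketch; the log path is hypothetical, and which signal (if any) your ISP actually sends is exactly what the log would tell you.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Log which signal (if any) kills the process, so a CPU limit (SIGXCPU)
# can be told apart from a timeout kill (SIGTERM) or a dropped
# connection (SIGPIPE).  SIGKILL cannot be caught, so silence there
# would itself be a clue.
my $log = "/tmp/cgi-signal.log";   # hypothetical path -- pick one the CGI user can write to

for my $sig (qw(XCPU TERM PIPE ALRM HUP)) {
    $SIG{$sig} = sub {
        open my $fh, '>>', $log or return;
        printf $fh "caught SIG%s after %.1f CPU-seconds\n", $sig, (times)[0];
        close $fh;
        exit 1;
    };
}

# ... the heavy calculation goes here ...
```

If the script dies and the log stays empty, the process was most likely SIGKILLed or the timeout is being enforced by the web server in front of it rather than by the kernel.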
Any time the calculation runs over about two minutes, the page just stops loading with no error and the browser says "Done". Is there a strategy I can use to find out what kind of limit I am running up against? Perhaps they have some kind of time or CPU usage limit in place? My script is recursive, so it could easily be parallelized. Should I try fork()?
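On the fork() question: splitting the recursion across child processes only helps if the host enforces a *per-process* CPU limit; against a wall-clock timeout on the whole request it buys nothing, and it may annoy an ISP that also limits process counts. A minimal sketch, where `compute()` and the result-file path are hypothetical stand-ins for one branch of the real recursion:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Fan the top-level branches of the recursion out to child processes,
# then reap them with waitpid().  Each child writes its result to a
# file because a child's variables do not propagate back to the parent.
my @branches = (1 .. 4);
my %kid;                            # pid => branch number

for my $b (@branches) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {                # child: do one branch and exit
        my $result = compute($b);
        open my $fh, '>', "/tmp/branch.$b" or exit 1;   # hypothetical result file
        print $fh "$result\n";
        close $fh;
        exit 0;
    }
    $kid{$pid} = $b;                # parent: remember the child
}

waitpid($_, 0) for keys %kid;       # block until every child has finished

# placeholder for one branch of the real recursive calculation
sub compute { my ($b) = @_; return $b * $b }
```

Note that fork() spreads the work over more processes, not more wall-clock headroom, unless the machine has idle CPUs and the limit is per process.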
Perhaps there is a script available that will send the web client a "top"-like real-time display?
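Short of a full "top"-like display, the usual CGI trick is to unbuffer STDOUT and print a progress line after each chunk of work; many servers reset their idle timer whenever bytes flow, so this can also dodge a plain inactivity timeout. A sketch, assuming the ISP's server does not buffer CGI output itself (`heavy_chunk()` is a hypothetical slice of the real calculation):

```perl
#!/usr/bin/perl
use strict;
use warnings;

print "Content-type: text/plain\r\n\r\n";

$| = 1;   # unbuffer STDOUT so each line reaches the client as it is printed

for my $step (1 .. 10) {            # stand-in for the real recursive calc
    heavy_chunk();                  # hypothetical: one slice of the work
    print "step $step/10 done, ", scalar(times), " CPU-seconds used\n";
}

print "final answer goes here\n";

# placeholder: sleep 0.1s in place of real work
sub heavy_chunk { select(undef, undef, undef, 0.1) }
```

If the front-end still buffers everything, the more robust pattern is to background the calculation, return immediately, and have the page poll for the result with a second request (or a meta refresh).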
Thanks,
drinkd