Re: Webhosts: What are reasonable resource limits for Perl?
by dws (Chancellor) on Jan 21, 2002 at 05:11 UTC
I thought this was a little restrictive. I was curious to know what sort of resource limits other monks had seen on virtual web hosting providers and what they thought was "reasonable".
I've seen similar restrictions (2 CPU seconds or 15-20 clock seconds). In my opinion, when you are on a shared system with a typical $25-$50/month account, those limits are quite reasonable. They help ensure a minimal level of service to others on the box.
If you're doing something with a CGI that takes more than 2 CPU seconds, and have already carefully considered your algorithms and partial-result caching, you ought to consider getting onto your own box, if only out of fairness to others.
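Partial-result caching is often the cheapest way to stay under such a cap. A minimal sketch, assuming a Bourne shell; `date` is a stand-in for the real expensive computation, and the cache path and 5-minute window are illustrative:

```shell
# Minimal partial-result caching: rebuild an expensive report at most
# once every 5 minutes, and serve the cached copy on every other hit.
# 'date' stands in for the real, expensive work.
CACHE=/tmp/report.cache
if [ -z "$(find "$CACHE" -mmin -5 2>/dev/null)" ]; then
    date > "$CACHE"       # cache missing or stale: regenerate it
fi
cat "$CACHE"              # serve whatever is cached
```

Only one request per window pays the full CPU cost; everyone else gets a near-instant `cat`.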
My opinion. YMMV.
So I guess that with those types of web hosts, running a script that might exceed those limits (e.g. a newsletter mailer, an HTTP link checker, etc.) just isn't an option?
Re: Webhosts: What are reasonable resource limits for Perl?
by talexb (Chancellor) on Jan 21, 2002 at 07:47 UTC
My web provider, pair Networks, limits jobs to an 8 MB working set. Processes going outside those limits are killed. Boom, gone.
Confusingly, pair recently sent out instructions on how to install Perl modules on our own accounts. Thrilled, I followed the instructions (I hadn't done it myself before), only to find that some of the processes spawned by the installation got themselves killed for using too many resources.
I wrote to support asking them about this Catch-22 situation, and after a week's delay the reply came back that if I wanted to install my own modules, I was free to compile my own version of Perl. I'm not sure how this makes any difference unless Perl modules can be compiled into Perl ... but isn't compiling Perl also going to go over my 8 MB limit?
This is weird stuff.
And to echo what someone else said, my cron processes can run no more frequently than every two hours, and they strongly recommend that these processes be niced when they do run.
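A crontab entry honouring those rules might look like this (the schedule, nice level, and script path are illustrative, not from pair's documentation):

```shell
# Crontab fragment: run a maintenance script every two hours at most,
# niced down so it yields the CPU to others on the shared box.
0 */2 * * * nice -n 19 $HOME/bin/maintenance.pl
```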
Next task: Do a Super Search on compiling Perl. I'll give it a shot.
--t. alex
"Of course, you realize that this means war." -- Bugs Bunny.
Re: Webhosts: What are reasonable resource limits for Perl?
by tstock (Curate) on Jan 21, 2002 at 07:50 UTC
Restrictions on shared accounts on pair.com:
* Size of Core Files - 0 MB
* CPU Time Used - 30 seconds
* Data Size - 3 MB
* File Size Created - 1 MB
* Memory Locked - 1 MB
* Number of Open Files - 32
* Number of Simultaneous Processes - 8
"System resource limits are intended to prevent runaway CGI scripts. Also, processes with large memory footprints or hungry CPU requirements will incur swapping and other system slowdowns that reduce server performance."
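Those caps line up with the classic setrlimit() resource limits, which a shell exposes through `ulimit`. A sketch of applying pair-style caps in a throwaway subshell (the shell-wrapper framing is an assumption; the host presumably calls setrlimit() directly):

```shell
# Apply pair-style caps in a throwaway subshell, so the login shell's
# own limits stay untouched. A sketch, not pair's actual mechanism.
(
  ulimit -c 0     # core file size: 0
  ulimit -t 30    # CPU time: 30 seconds
  ulimit -n 32    # open files: 32
  # likewise: ulimit -d 3072 (3 MB data segment, counted in KB),
  # ulimit -f and -l for file size and locked memory,
  # and ulimit -u 8 for simultaneous processes
  echo "caps in effect: cpu=$(ulimit -t)s fds=$(ulimit -n)"
)
```

Running them in a subshell means the limits die with it; anything started inside inherits them and gets killed by the kernel on overrun.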
Sounds reasonable to me :)
Tiago
To follow up on that, the reply from pair Networks support told me to look at this page for resource limits. The memory limit there is 8 MB, and that's what I was quoting.
It just plain confuses me that I can compile Perl in its entirety (I just did it this morning; only the install failed, because I can't write to /usr/local/bin), yet I cannot install a Perl module. It seems odd that building a module takes more resources than building Perl itself.
I guess I still have a whack of learning to do. :)
--t. alex
"Of course, you realize that this means war." -- Bugs Bunny.
First of all, I hope you know that you can reinstall Perl quite easily, with the customized version written to your own home directory.
Secondly, at a guess, your problem on install was using the CPAN module. While nice, it unfortunately routinely uses at least 16 MB on my machine. However, downloading and installing modules manually should work just fine. And if all else fails, you can install them elsewhere and then just copy them over. (That works quite well for pure Perl modules. Compiled ones, not always so.)
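The manual route described here might look something like this (the Perl version, module name, and prefix are illustrative; this is a sketch, not pair-specific instructions):

```shell
# 1. Build a private Perl under your home directory instead of /usr/local:
cd perl-5.6.1
./Configure -des -Dprefix=$HOME/perl
make && make test && make install

# 2. Install a module by hand, bypassing the memory-hungry CPAN shell:
tar xzf Some-Module-1.00.tar.gz
cd Some-Module-1.00
$HOME/perl/bin/perl Makefile.PL   # picks up the private Perl's prefix
make && make test && make install
```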
We're actually working on revising our process controls, which might include raising some of these limits (they certainly will never be lowered).
Who is we? Matt, do you work at pair Networks?
--t. alex
"Of course, you realize that this means war." -- Bugs Bunny.
Re: Webhosts: What are reasonable resource limits for Perl?
by vagnerr (Prior) on Jan 21, 2002 at 17:07 UTC
I did some research, contacted their web host, and found out that their virtual web hosting provider would automatically kill Perl processes after a minimum of 2 CPU seconds or 20 clock seconds.
From the point of view of you, the programmer, it does seem very restrictive that your Perl masterpiece is permitted only 2 CPU seconds of fame. However, consider it from the point of view of (most) end users browsing your site: 2 seconds is a long time. People get itchy after 1 second of real time, and are ready to hit reload or go somewhere else after just four. It's sad, really; I remember when I was quite happy to wait 30+ seconds just to get a pure text page. These days people expect full-page Flash graphics in a fraction of that time.
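The gap between the two budgets (2 CPU seconds vs. 20 clock seconds) is easy to demonstrate: a process that mostly waits, say on the network, burns wall-clock seconds while consuming almost no CPU. A quick shell illustration:

```shell
# A sleeping process accumulates wall-clock ("real") time but almost no
# CPU ("user" + "sys") time, so it would hit a 20-clock-second cap long
# before a 2-CPU-second one.
time sleep 2
```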
---
If it doesn't fit use a bigger hammer
Re: Webhosts: What are reasonable resource limits for Perl?
by mr.dunstan (Monk) on Jan 22, 2002 at 06:34 UTC
If this is a site being served to the general public, you should be shooting for sub-second response times. Otherwise, 2 seconds sounds normal for shared hosting.
Maybe it's time to start thinking private server and mod_perl!
-mr.dunstan