arcterex has asked for the wisdom of the Perl Monks concerning the following question:

I have a Perl server written using the model from the panther book. It has a main listener that watches for requests on port x. When a request comes in on that port, it fork()s and continues listening. The forked process responds to requests in a simple read/write fashion (i.e., it takes the string "LOGIN username", checks that the username exists, returns "OK" or "BAD", waits for "PASS password", and so on). The problem is that while this works fine for 2 or 4 users logged in, memory quickly disappears on a 64 MB machine when 30 or 40 users are logged in, since each Perl server process takes up roughly 3 MB. I've trimmed memory usage inside the server itself as much as I can, but it's still eating memory, so I'm looking for advice on how to fix this. Some thoughts I've had:

- IPC or shared memory.
- Using a single server that does buffering/queuing, or something like that.
- Having the client processes somehow write their instructions to a file, with the single main server reading through that file and responding (but how would it know where to send the responses?).

Any thoughts/advice would be greatly appreciated.
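For reference, a stripped-down sketch of the fork-per-connection model described above (the port number and the request handling are placeholders, not the actual server code):

    #!/usr/bin/perl
    # Main listener forks one child per incoming connection.
    use strict;
    use IO::Socket::INET;

    my $listener = IO::Socket::INET->new(
        LocalPort => 9000,          # stands in for "port x"
        Listen    => 5,
        Reuse     => 1,
    ) or die "listen: $!";

    $SIG{CHLD} = 'IGNORE';          # let finished children be reaped automatically

    while (my $client = $listener->accept) {
        my $pid = fork;
        die "fork: $!" unless defined $pid;
        if ($pid == 0) {            # child: handle this one connection
            close $listener;
            while (my $line = <$client>) {
                print $client "OK\r\n";   # placeholder for the LOGIN/PASS dialogue
            }
            exit 0;
        }
        close $client;              # parent: go back to listening
    }

Each child is a full copy of the Perl interpreter plus the server code, which is where the roughly 3 MB per logged-in user goes.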

Replies are listed 'Best First'.
Re: Lowering memory usage in a perl server
by Anonymous Monk on Feb 25, 2000 at 22:07 UTC
    Could you have one process that does non-blocking I/O? The process keeps a list of open connections and polls each one to see if data has come in. You need non-blocking I/O and "select" to do this.
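    A minimal sketch of that approach using IO::Select (the port number and the one-request-per-read handling are placeholders):

        #!/usr/bin/perl
        # One process multiplexes every connection with select().
        use strict;
        use IO::Socket::INET;
        use IO::Select;

        my $listener = IO::Socket::INET->new(
            LocalPort => 9000, Listen => 5, Reuse => 1,
        ) or die "listen: $!";

        my $sel = IO::Select->new($listener);

        while (1) {
            # can_read blocks until at least one handle is ready
            for my $sock ($sel->can_read) {
                if ($sock == $listener) {
                    $sel->add($listener->accept);   # new client connection
                    next;
                }
                my $n = sysread($sock, my $buf, 4096);
                if ($n) {
                    # parse $buf (e.g. "LOGIN username") and reply
                    print $sock "OK\r\n";
                }
                else {                              # EOF or error: drop the client
                    $sel->remove($sock);
                    close $sock;
                }
            }
        }

    All clients share a single process, so memory use stays flat no matter how many users are logged in; the trade-off is that no single request may block for long.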
RE: Lowering memory usage in a perl server
by sergio (Beadle) on Feb 25, 2000 at 19:35 UTC
    Sometimes, if the processing you need to perform is simple and quick, you should see whether a serialized way of handling the requests is enough! Another possibility, if serialization alone is not good enough, is to combine it with a pre-forked strategy (similar to what Apache does), so you spread the load by having a set of processes listening on the same socket!
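    A minimal pre-forked sketch along those lines (pool size, port, and request handling are placeholders); each worker blocks in accept() on the shared listening socket, so nothing forks per request and the memory cost is fixed at the pool size:

        #!/usr/bin/perl
        use strict;
        use IO::Socket::INET;

        my $listener = IO::Socket::INET->new(
            LocalPort => 9000, Listen => 5, Reuse => 1,
        ) or die "listen: $!";

        my $pool_size = 5;                  # placeholder pool size
        for (1 .. $pool_size) {
            my $pid = fork;
            die "fork: $!" unless defined $pid;
            next if $pid;                   # parent: spawn the next worker

            # worker: serve clients one after another, forever
            while (my $client = $listener->accept) {
                while (my $line = <$client>) {
                    print $client "OK\r\n"; # placeholder request handling
                }
                close $client;
            }
            exit 0;
        }

        1 while wait() != -1;               # parent just waits on the pool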