I have a Perl server written using the model from the Panther book. It has a main listener which waits for requests on port x. When a request comes in on that port it fork()s and continues listening, and the forked process handles the connection in a simple read/write fashion (i.e. it takes the string "LOGIN username", checks that the username exists, returns "OK" or "BAD", waits for "PASS password", and so on).
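For reference, the listener is roughly this shape (a heavily cut-down sketch, not the real code; the port number and the user_exists() check are placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use IO::Socket::INET;

    $SIG{CHLD} = 'IGNORE';    # auto-reap children so they don't linger as zombies

    # 7000 stands in for "port x"; user_exists() below is a stub.
    my $listener = IO::Socket::INET->new(
        LocalPort => 7000,
        Listen    => 10,
        ReuseAddr => 1,
    ) or die "listen: $!";

    while (my $client = $listener->accept) {
        my $pid = fork;
        die "fork: $!" unless defined $pid;
        if ($pid == 0) {                     # child: one full perl process per user
            close $listener;
            while (my $line = <$client>) {
                if ($line =~ /^LOGIN (\S+)/) {
                    print {$client} user_exists($1) ? "OK\n" : "BAD\n";
                }
                # ... PASS and the rest of the protocol
            }
            exit 0;
        }
        close $client;                       # parent just keeps listening
    }

    sub user_exists { return 1 }             # stub for the real check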
The problem is that while this works fine with 2 or 4 users logged in, memory quickly disappears on a 64 MB machine when 30 or 40 users are logged in, as each forked Perl server process takes up approximately 3 MB.
I've trimmed memory usage inside the server itself as much as I can, but it's still sucking up memory. So I'm looking for advice on how to fix this. Some thoughts I've had:
IPC or shared memory.
Using a single server that does buffering/queuing, or something like that (see the rough sketch after this list).
Having the client processes somehow write their instructions to a file, with the single main server reading through that file and responding (though how would it know where to send the responses?).
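To make the second idea more concrete, what I have in mind is something like a single select()-based process that multiplexes all the connections itself, so there is only one perl interpreter in memory no matter how many users are logged in. This is only an untested sketch; the port number, the %state bookkeeping, and user_exists() are placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use IO::Socket::INET;
    use IO::Select;

    # 7000 stands in for "port x"; user_exists() below is a stub.
    my $listener = IO::Socket::INET->new(
        LocalPort => 7000,
        Listen    => 10,
        ReuseAddr => 1,
    ) or die "listen: $!";

    my $select = IO::Select->new($listener);
    my %state;                       # per-connection state, keyed by socket

    while (my @ready = $select->can_read) {
        for my $fh (@ready) {
            if ($fh == $listener) {          # new connection: just add it to the set
                my $client = $listener->accept or next;
                $select->add($client);
                $state{$client} = { user => undef };
            }
            else {
                my $line = <$fh>;
                if (!defined $line) {        # client disconnected
                    $select->remove($fh);
                    delete $state{$fh};
                    close $fh;
                    next;
                }
                if ($line =~ /^LOGIN (\S+)/) {
                    my $ok = user_exists($1);
                    $state{$fh}{user} = $1 if $ok;
                    print {$fh} $ok ? "OK\n" : "BAD\n";
                }
                # ... PASS and the rest of the protocol, dispatched off %state
            }
        }
    }

    sub user_exists { return 1 }             # stub for the real check

The catch is that <$fh> blocks until a full line arrives, so a slow client could stall everyone; a real version would sysread() into per-connection buffers (the buffering/queuing part) and only act on complete lines. But the memory picture changes: one interpreter holding 30 or 40 sockets instead of 30 or 40 interpreters at 3 MB each.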
Any thoughts/advice would be greatly appreciated.