in reply to HTTP::Daemon cannot support keep alive properly?

In order to handle multiple concurrent requests, you'll have to fork a new process for each connection. HTTP::Daemon does not do this for you. From the doc:

This HTTP daemon does not fork(2) for you. Your application, i.e. the user of the HTTP::Daemon is responsible for forking if that is desirable.

You'll need to add something like this (untested):

use HTTP::Daemon;

my $d = HTTP::Daemon->new || die "Cannot create daemon: $!";
while (my $c = $d->accept) {
    my $childPID = fork;
    unless (defined $childPID) {
        # fork failed - error handling here...
        $c->close;
        next;
    }
    if ($childPID == 0) {
        # we're the child:
        # your request-processing code here...
        $c->close;
        exit 0;
    }
    # we're the parent: close our copy of the client socket
    # and go back to wait for another request...
    $c->close;
}

That should give you an idea of what you'll need to do.
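One detail the snippet glosses over: unless the parent reaps (or explicitly ignores) its children, each finished child lingers as a zombie process. Assuming you don't need the children's exit codes, a common Perl idiom is to set this once, before the accept loop:

    $SIG{CHLD} = 'IGNORE';  # auto-reap children so zombies don't accumulate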

UPDATE - Disregard - I had the wrong end of the stick entirely :-)

-- vek --

Re: Re: HTTP::Daemon cannot support keep alive properly?
by pg (Canon) on Oct 11, 2003 at 21:11 UTC

    Thanks vek, but you misinterpreted the issue, and what you said is not the point here.

    I knew that, to process multiple connections AT THE SAME TIME, I need to fork or use multi-threading, and that's what I did.

    The problem here is a different turkey (is it because Thanksgiving is coming? ;-). HTTP/1.1 allows the client (usually a browser) to send multiple HTTP requests over the same connection (this is called keep-alive), and they obviously come in sequence, since they are from the same client (unless the client does multi-threading, but even if that is the case, it is still not a problem). On the server side, there is no need for multi-threading for this PARTICULAR issue, especially when you specify a socket queue size greater than 1, for example 20 or so. Even if requests over the same connection come in at the same time, they will just be queued, ready to be processed later, and no harm is done.
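    For the record, the usual pattern for serving several keep-alive requests over one HTTP::Daemon connection is to keep calling get_request on the client connection until it returns undef. A minimal single-connection sketch (the port number and the canned response are placeholders I've made up):

        use HTTP::Daemon;
        use HTTP::Response;

        my $d = HTTP::Daemon->new(LocalPort => 8080) || die "Cannot listen: $!";
        while (my $c = $d->accept) {
            # get_request returns undef once the client closes the
            # connection (or the keep-alive timeout is hit), so this
            # inner loop serves every request on the connection in turn.
            while (my $r = $c->get_request) {
                $c->send_response(
                    HTTP::Response->new(200, 'OK', undef, "hello\n"));
            }
            $c->close;
            undef $c;
        }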

    But thanks for your idea anyway; have a great day.