in reply to Closing Connections using HTTP::Daemon

Bummer, I cannot get your error to repeat on either Win2000 or WinXP. If I leave that line in (the sleep 1 while ($connex->connected) that's commented out below) and watch the thread count in Windows Task Manager, the count goes up with almost every call. With it removed, however, it works fine for me (I had to emulate your handler functions, since you didn't post them):
#...
        if ($req_method eq 'POST') {
            DoPost($connex, $peeraddr, $req_content, $req_uri)
        }
        elsif ($req_method eq 'GET') {
            DoGet($connex, $peeraddr, $req_uri)
        }
        else {
            SendNotFound($connex)
        }
    }
    print "Connection $connectno finished.\n";
    #sleep 1 while ($connex->connected);
    $connex->close;
    print "Connection $connectno closed.\n\n";
}

sub DoPost {        #dummy function
    my ($c, $p, $req, $req_uri) = @_;
    $c->send_file_response(qq|c:\\temp\\foo.dat|);
}

sub DoGet {         #dummy function
    my ($c, $p, $uri) = @_;
    $c->send_file_response(qq|c:\\temp\\foo.dat|);
}

sub SendNotFound {  #dummy function
    my ($c) = @_;
    $c->send_file_response(qq|c:\\temp\\foo.dat|);
}

&RunServer;
1;

I got:
...
Connection 69 at Fri Nov 11 16:10:53 2005 from 127.0.0.1
Connection: 69
Request: GET /
Connection 69 finished.
Connection 69 closed.

Connection 70 at Fri Nov 11 16:10:53 2005 from 127.0.0.1
Connection: 70
Request: GET /
Connection 70 finished.
Connection 70 closed.
...
The correct output reached the browser/LWP (command shell) each and every time. Perhaps something else in your handler functions is causing those threads to kill the parent process? Can you post some more of that code?
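
If one of the handlers die()s inside a detached thread, the failure can be easy to miss from the outside. Here's a minimal, untested sketch (reusing the dummy handler names from above; the 500 response is my own addition) that traps the error and logs it rather than letting the thread vanish quietly:

sub HandleRequest {
    my ($connex, $peeraddr, $req_method, $req_content, $req_uri) = @_;
    # Wrap the dispatch in eval so a die() inside DoPost/DoGet is caught here
    eval {
        if    ($req_method eq 'POST') { DoPost($connex, $peeraddr, $req_content, $req_uri) }
        elsif ($req_method eq 'GET')  { DoGet($connex, $peeraddr, $req_uri) }
        else                          { SendNotFound($connex) }
    };
    if ($@) {
        print "Handler died: $@";    # or append to a log file instead
        $connex->send_error(500);    # let the client know something broke
    }
    $connex->close;
}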

Update: I took a closer look at my Task Manager and noticed 100% CPU utilization from your code in that IO::Select loop, so I rewrote it a bit (using the perldoc example from HTTP::Daemon):

#!/usr/bin/perl -w
use strict;
use threads;
use HTTP::Daemon;
$|++;

my $coreDaemon = HTTP::Daemon->new(LocalPort => 90) or die $!;
my $connectno  = 0;

print "Listening...";
while (my $c = $coreDaemon->accept) {
    while (my $r = $c->get_request) {
        if ($r->method eq 'GET') {
            my ($thrd) = threads->create(\&GetHandler, $c, ++$connectno);
            $thrd->detach;
            #$c->send_file_response("c://temp//foo.dat");
        }
        else {
            #$c->send_error(RC_FORBIDDEN)
            $c->send_file_response("c://temp//failed.dat");
        }
    }
}

#from the old code
sub GetHandler {
    my ($connex, $connectno) = @_;
    my $peeraddr = $connex->peeraddr;
    prin2log("Connection $connectno started.");
    $connex->send_file_response("c://temp//foo.dat");
    prin2log("Connection $connectno finished.");
    #sleep 1 while ($connex->connected);
    $connex->close;
    prin2log("Connection $connectno closed.\n");
}

#STDOUT messages will now go to this (non-filelocked) file
sub prin2log {
    my ($str) = @_;
    open(H, qq|>>c:\\temp\\foo.log|) or die qq|Cannot write to log: $!|;
    print H $str . qq|\n|;
    close(H);
}

Celebrate Intellectual Diversity

Re^2: Closing Connections using HTTP::Daemon
by Dr. Mu (Hermit) on Nov 11, 2005 at 23:28 UTC
    Wow, thanks for going to all that trouble! My DoPost and DoGet routines get pretty involved, so I omitted them, preferring not to scare off any help that might otherwise be forthcoming -- and to avoid the inevitable grief for not using the CGI module. :-)

    The only possibly relevant factor is that the POST routine is able to change some shared variables -- notably %Admin and %User. The GET routines have read-only access to these variables as well. By all indications, the sharing works flawlessly. I'm even able to switch ports midstream without a glitch.
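
    The kind of pattern I mean is roughly this -- not my actual code, just a simplified sketch with made-up keys -- with lock() guarding both the writer and the readers:

    use threads;
    use threads::shared;

    my %Admin : shared;   # written by the POST routine, read by the GET routines
    my %User  : shared;

    sub set_admin {       # POST side: take the lock before writing
        my ($key, $value) = @_;
        lock(%Admin);
        $Admin{$key} = $value;
    }

    sub get_admin {       # GET side: lock too, so a read never sees a half-done update
        my ($key) = @_;
        lock(%Admin);
        return $Admin{$key};
    }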

    To test the code, I created a self-refreshing HTML page and loaded it into two separate windows in my browser (Opera), just to keep the server busy. I then manually caused some POSTs to be sent, using a page with <form>s. The crashes usually occurred a couple of GETs after a POST, but always close to the fifth or sixth POST. Oddly enough, even after the crash -- when Windows reports that it has to terminate perl.exe -- the server keeps servicing requests! This keeps up until the next POST, whereupon the house of cards finally collapses.
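
    (The same GET/POST mix could be driven from a script instead of the browser -- a rough, untested sketch with a made-up form field, assuming the port 90 from the code above -- in case that helps reproduce it:)

    use strict;
    use LWP::UserAgent;

    my $ua  = LWP::UserAgent->new(keep_alive => 1);    # reuse connections, like Opera does
    my $url = 'http://localhost:90/';

    for my $i (1 .. 100) {
        my $res = ($i % 10 == 0)
            ? $ua->post($url, { action => 'test' })    # every tenth request is a POST
            : $ua->get($url);
        print "$i ", $res->status_line, "\n";
    }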

    Opera will continue to reuse an open connection -- even sharing it among different windows -- until it's required to issue a POST. Then it will close one of its connections and open a new one to send the POST, and continue to share that connection with other windows. So my feeling is that the POST itself is a red herring and that it has more to do with connections closing and/or threads exiting.
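
    One blunt thing I could try (untested sketch) is to stop depending on Opera's keep-alive reuse altogether: tell HTTP::Daemon not to expect another request on the socket and close it myself right after each response:

    # Inside the per-connection handler, once a request has been read:
    $c->force_last_request;         # get_request won't try to read another request off this socket
    $c->send_file_response($file);  # $file here stands in for whatever the handler serves
    $c->close;                      # close explicitly instead of waiting on the client
    undef $c;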

    But the mystery remains as to why the connection stays active (->connected) once Opera abandons it.
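
    One way to get more light on that (untested sketch; the 30-second value is arbitrary) would be to give the daemon a Timeout, so an abandoned socket can't sit in get_request forever, and to log $c->reason when get_request finally gives up:

    my $d = HTTP::Daemon->new(LocalPort => 90, Timeout => 30) or die $!;

    while (my $c = $d->accept) {    # the accepted socket inherits the listener's timeout
        while (my $r = $c->get_request) {
            # ... handle $r ...
        }
        # get_request returned undef: either the client went away or we timed out
        print "get_request gave up: ", $c->reason, "\n";
        $c->close;
        undef $c;
    }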

Re^2: Closing Connections using HTTP::Daemon
by Dr. Mu (Hermit) on Nov 11, 2005 at 23:41 UTC
    re: your update. I needed to use the select method, since accept was blocking, preventing other threads from getting time. (There's also a client thread that gathers RSS feeds, and another that manages some serial ports.)
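
    A small timeout on can_read should keep the select loop from spinning at 100%, though -- something like this untested sketch, assuming port 90 as above:

    use IO::Select;
    use HTTP::Daemon;

    my $d   = HTTP::Daemon->new(LocalPort => 90) or die $!;
    my $sel = IO::Select->new($d);

    while (1) {
        if ($sel->can_read(0.25)) {    # wait at most 250 ms instead of blocking in accept
            my $c = $d->accept or next;
            # hand $c off to a worker thread as before
        }
        # other per-loop housekeeping for this thread can go here
    }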