I'll start with an apology for the off-topic post. Reap me if I'm too far afield; I know of no other place to ask.
I'm running a moderately sized Perl/CGI script that I wrote, on an Apache server. About 60% of the time it is spawned, it leaves behind a "defunct" process that eats a large chunk of CPU cycles for about a second, to the point where it causes problems for other applications on the same box.
I'm running with strict and warnings enabled, I have cleaned up all the warnings, and the script exits cleanly.
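For context, a "defunct" entry usually means a child process has exited but its parent has not yet called waitpid() to reap it. If the script forks or pipes to children itself (I'm assuming it does; this is a sketch, not a diagnosis), installing a SIGCHLD handler along these lines is the common fix:

```perl
use strict;
use warnings;
use POSIX ':sys_wait_h';

my $reaped = 0;

# Reap every exited child without blocking, so none of them
# linger in the process table as <defunct> (zombie) entries.
$SIG{CHLD} = sub {
    while ((my $pid = waitpid(-1, WNOHANG)) > 0) {
        $reaped++;    # child $pid has been collected
    }
};

my $pid = fork();
die "fork failed: $!" unless defined $pid;
if ($pid == 0) {
    exit 0;           # child: stand-in for the real background work
}
sleep 1;              # parent: give the SIGCHLD handler a chance to run
```

Note that system() and backticks normally wait for their children on their own, so this only applies to explicit fork()/open("-|")-style spawning.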
Any insight would be appreciated. Thanks!