in reply to Re: need help debugging perl script killed by SIGKILL
in thread need help debugging perl script killed by SIGKILL

Thanks for the reply. I modified my $queue->enqueue() operation to use a shared variable, but the same problem still occurs.

When you stated "your data going into the queue must be declared :shared", how do I do that? I have been searching Google and have not found anything on how to accomplish this for enqueue operations.


Replies are listed 'Best First'.
Re^3: need help debugging perl script killed by SIGKILL
by bliako (Abbot) on Mar 02, 2021 at 19:12 UTC

    The link I posted peripherally shows how to enqueue a blessed hash (object). And the threads::shared documentation has an example of how to create a shared hash which contains other shared items.

    use threads;
    use threads::shared;
    use Thread::Queue;

    my %hash;   share(%hash);
    my $scalar; share($scalar);
    my @arr;    share(@arr);
    # or:
    # my (%hash, $scalar, @arr) :shared;

    $scalar = "abc";
    $hash{'one'} = $scalar;
    $hash{'two'} = 'xyz';
    $arr[0] = 1;
    $arr[1] = 2;
    $hash{'three'} = \@arr;

    my $q = Thread::Queue->new();  # A new empty queue
    $q->enqueue(\%hash);
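    To round out the picture, here is a minimal end-to-end sketch of the same idea with a consumer on the other end of the queue (the worker sub and the undef sentinel are my illustration, not part of the snippet above): the shared hash is enqueued in the main thread and its contents are visible when dequeued in the worker.

```perl
use strict;
use warnings;
use threads;
use threads::shared;
use Thread::Queue;

my $q = Thread::Queue->new();

# Worker thread: drain the queue until it sees the undef sentinel.
my $worker = threads->create(sub {
    while (defined(my $item = $q->dequeue())) {
        print "got: $item->{'two'}\n";  # nested shared data is visible here
    }
});

my %hash :shared;
$hash{'two'} = 'xyz';
$q->enqueue(\%hash);

$q->enqueue(undef);  # tell the worker to finish
$worker->join();
```

    Note that everything reachable from the enqueued reference must itself be shared, otherwise enqueue will die with "Invalid value for shared scalar".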

    The "pitfall" I had in mind is this:

    my $hugedata = <BIGSLURP>;
    my (%hash) :shared;
    %hash = process($hugedata); # perhaps filtering or rearranging it into a hash
    # $hugedata = undef; # <<< if done with it, then unload it, otherwise ...
    threads->create(...);       # ... $hugedata is duplicated, %hash is not.
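    You can see the duplication directly with a small experiment (my own sketch): a plain lexical in scope at threads->create() is cloned into the new thread, so modifying it there is invisible to the parent, while a :shared variable is a single copy.

```perl
use strict;
use warnings;
use threads;
use threads::shared;

my $plain          = 'original';
my $shared :shared = 'original';

threads->create(sub {
    # The thread got its own clone of $plain; $shared is the one shared copy.
    $plain  = 'changed in thread';
    $shared = 'changed in thread';
})->join();

print "plain:  $plain\n";   # still 'original' - only the clone was modified
print "shared: $shared\n";  # 'changed in thread'
```

    The cloning is also why a huge unshared $hugedata costs memory per thread: each thread gets its own copy at creation time.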

    Memory is not the only reason the kernel can kill your process; perhaps "too many" threads will have the same effect. So you should also check the exact messages in /var/log/messages and the output of dmesg, as Fletch suggested. Additionally, you should measure memory usage exactly (rather than just observing a SIGKILL and assuming it was memory). If you are on some sort of *nix, that's easy.
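    For instance, on Linux you can watch your own resident set size from inside the script by reading /proc/self/status (this helper is my own sketch, Linux-only; on other *nix systems something like `ps -o rss= -p $$` is an alternative):

```perl
use strict;
use warnings;

# Linux-only: return the process's resident set size in kB, or undef.
sub rss_kb {
    open my $fh, '<', '/proc/self/status' or return undef;
    while (<$fh>) {
        return $1 if /^VmRSS:\s+(\d+)\s+kB/;
    }
    return undef;
}

my $before = rss_kb();
my @big    = (1) x 1_000_000;   # allocate something noticeable
my $after  = rss_kb();
printf "RSS before: %s kB, after: %s kB\n",
    $before // 'n/a', $after // 'n/a';
```

    Logging this before and after each enqueue or thread creation will show you exactly where the growth happens, instead of guessing from the SIGKILL.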

    bw, bliako