"The code lacks a user defined option for handling duplicates and is likely buggy."
Additionally, ikegami pointed out a poor API choice with an aesthetic suggestion for fixing it. I had already lost interest, but recently I had a need for this and found out that it was in fact very buggy (two bugs corrected and commented below).

    use List::Util qw(first);

    sub gen_merger {
        my ($list, $fetch, $compare, $finish) = @_;
        my @item = map $fetch->($_), @$list;
        my $done;
        return sub {
            return $finish if $done;
            my $idx = first { $item[$_] ne $finish } 0 .. $#item;
            my $next = $item[$idx];
            for ($idx + 1 .. $#item) {
                next if $item[$_] eq $finish;
                my $result = $compare->($next, $item[$_]);
                #$next = $item[$_] if $result == 1;
                # Need to keep track of which one we use, not just the value
                ($idx, $next) = ($_, $item[$_]) if $result == 1;
            }
            $item[$idx] = $fetch->($list->[$idx]);
            #$done = 1 if ! first {$item[$_] ne $finish} $idx .. $#item;
            # First element of array is 0, so use defined instead of truth
            $done = 1 if ! defined first { $item[$_] ne $finish } $idx .. $#item;
            return $next;
        };
    }
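For reference, this is the shape of the calling code, with hypothetical array-backed iterators standing in for the file handles (make_iter and the empty-string $finish sentinel are my stand-ins for this sketch, not part of the original):

```perl
use strict;
use warnings;

# Array-backed stand-ins for the file handles: each coderef yields its
# next element, then '' (the $finish sentinel here) once exhausted.
# The empty-string sentinel is safe only because no source yields ''.
sub make_iter {
    my @queue = @_;
    return sub { @queue ? shift @queue : '' };
}

my $next = gen_merger(
    [ make_iter(1, 4), make_iter(2, 3) ],
    sub { $_[0]->() },           # $fetch: pull one item from a source
    sub { $_[0] <=> $_[1] },     # $compare: sort-style, 1 if out of order
    '',                          # $finish: marks an exhausted source
);

my @merged;
while ((my $v = $next->()) ne '') {
    push @merged, $v;
}
# @merged is (1, 2, 3, 4)
```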
Like then, I was being lazy. I would normally lean on sort -m, but I needed a custom compare routine and was going to throw the code away after the file was merged. After running for a few minutes, it complained "Out of memory!". Having already wasted enough time, I just wrote a procedural version, but as I look at the code, I can't figure out where the memory leak is. In case you are thinking it might be in the calling code: I used it exactly as in the file handle example.
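The procedural version I fell back on was nothing fancier than the sketch below (merge_sorted, the numeric compare, and the sample data are illustrative stand-ins, not the actual throwaway code): repeatedly scan the front item of every source, emit the smallest, and refill from that source. With real files each refill would be a readline.

```perl
use strict;
use warnings;

# Procedural N-way merge of pre-sorted sources. Note: consumes the
# source arrays via shift.
sub merge_sorted {
    my ($compare, @sources) = @_;
    # Current front item of each source; undef marks an exhausted one.
    my @front = map { @$_ ? shift @$_ : undef } @sources;
    my @out;
    while (1) {
        my $min;
        for my $i (0 .. $#front) {
            next unless defined $front[$i];
            $min = $i if !defined $min
                      || $compare->($front[$min], $front[$i]) == 1;
        }
        last unless defined $min;    # every source exhausted
        push @out, $front[$min];
        $front[$min] = @{ $sources[$min] } ? shift @{ $sources[$min] } : undef;
    }
    return @out;
}

my @merged = merge_sorted(
    sub { $_[0] <=> $_[1] },         # sort-style compare routine
    [1, 4, 9], [2, 3, 10], [5],
);
# @merged is (1, 2, 3, 4, 5, 9, 10)
```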
In case it matters, I was using the stock perl 5.8.2 that ships with AIX 5.3.
Cheers - L~R
In reply to Help Diagnosing Memory Leak by Limbic~Region