in reply to Unique uniq implemenation

    $ perl -e'while(1){printf"%d ",$i++;printf("field%d ",rand(20))for(0..rand(20));print"\n";}' >foo.dat
    <CTRL-C>
    $ head -4 foo.dat
    0 field11 field6 field9 field14 field19 field11 field0 field2 field18 field10 field4 field17 field17
    1 field18 field12 field19 field9 field15
    2 field3 field16 field12 field17 field2 field10 field10 field1 field1 field10 field5 field1 field5 field11 field2
    3 field11 field2 field19 field16 field19 field15 field6 field10 field2 field7 field17 field8 field4
    $ ls -l foo.dat
    -rw------- 1 notroot 14753792 Mar 6 13:13 foo.dat
    $ time perl -lane '$n=shift@F;$f="@F";push@{$d{$f}},$n;END{for(sort keys %d){print"$_ -> @{$d{$_}}" if (@{$d{$_}}>1);}}' <foo.dat >foo.log
    real 21.1
    user 18.9
    sys 0.9
    $ tail -4 foo.log
    field9 field9 field3 -> 50797 123541
    field9 field9 field5 -> 25185 66389 134175 138790
    field9 field9 field6 -> 8571 93213
    field9 field9 field6 field2 -> 64192 151266

Memory topped out at SIZE 60M, RES 60M on my piddly 500MHz SunBlade.

Are you sure about your memory consumption? I grok big data daily with Perl.