in reply to How to remove duplicates from a large set of keys
What is the overall range, i.e. the highest and lowest values of your numbers?
If the range is something reasonable, you can store 1 bit per number in the overall range and use vec to test/set those bits. If you need the check to be persistent, save the bitstring in a file.
For a range of 0 to 1,000,000, you just need 125 KB (1,000,000 bits / 8) and it will be very fast.
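A minimal sketch of the in-memory version (the subroutine name and sample numbers are mine, just for illustration):

#! perl -slw
use strict;

my $bits = '';  ## one bit per possible number; vec() extends the string on demand

sub seenBefore {
    my( $no ) = @_;
    return 1 if vec( $bits, $no, 1 );  ## bit already set: a duplicate
    vec( $bits, $no, 1 ) = 1;          ## first sighting: set the bit
    return 0;
}

## Keeps only the first occurrence of each number
my @uniq = grep{ !seenBefore( $_ ) } 123, 456, 123, 789, 456;
print "@uniq";  ## 123 456 789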
Even if your numbers run as high as 1,000,000,000 (10 digits), this still only requires 125 MB in memory or on disk, and is still extremely fast.
If the range is greater than you can comfortably fit in memory, but within acceptable limits for a disk file (the full unsigned 32-bit range requires 0.5 GB), then doing the math and reading/checking/setting/re-writing the appropriate bit and byte is still pretty quick.
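The math in question is just integer division and remainder; for example (the value is arbitrary):

my $no    = 1_234_567;
my $oByte = int( $no / 8 );  ## 154320 : which byte of the file
my $bit   = $no % 8;         ## 7      : which bit within that byte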
The following code reads and writes an individual byte for every bit and can check a million numbers (setting ~630,000 of them) in just over 10 seconds. If they are already set, that time drops to 6.5 seconds.
You could try adding some buffering if your numbers tend to run in batches rather than being completely random, but buffering logic tends to slow things down in many cases. You could also try ':bytes' with read/seek/print instead of ':raw' and the sys* functions and see if PerlIO buffering buys you anything; a sketch of that variant follows the timings below.
Some crude code & timings:
#! perl -slw
use strict;
use Math::Random::MT qw[ rand ];
use Benchmark::Timer;

our $N ||= 1_000_000;

{   ## Use closures to retain info between calls or make it an object.
    my( $filename, $fh, $len ) = 'uniqints.dat';

    sub uniqPersistant {
        my( $no ) = @_;

        unless( $fh ) {
            ## Open an existing (or new first time) file;
            my $mode = -e( $filename ) ? '+<:raw' : '+>:raw';
            open $fh, $mode, $filename or die "$filename ($mode) : $!";

            ## Remember the length.
            $len = -s $filename;
        }

        ## Calculate byte offset and bit
        my( $oByte, $bit ) = ( int( $no / 8 ), ( $no % 8 ) );

        ## Extend the file with nulls if writing beyond the current end
        ## (valid offsets are 0 .. $len-1, hence <=)
        if( $len <= $oByte ) {
            my $extend = ( $oByte - $len + 1 );
            sysseek $fh, 0, 2;
            syswrite $fh, chr(0) x $extend;
            $len += $extend;    ## Accumulate the extension
        }

        ## Read the byte
        my $byte;
        sysseek $fh, $oByte, 0;
        sysread $fh, $byte, 1;

        ## All done if the bit is set
        return 1 if vec $byte, $bit, 1;

        ## Otherwise set it, re-write the byte and return false;
        vec( $byte, $bit, 1 ) = 1;
        sysseek $fh, $oByte, 0;
        syswrite $fh, $byte, 1;
        return;
    }
}

my @numbers = map{ int rand 1_000_000 } 1 .. $N;

my $T = Benchmark::Timer->new;

for ( 1 .. 2 ) {
    my $count = 0;
    $T->start( "Pass $_" );
    uniqPersistant( $_ ) or $count++ for @numbers;
    $T->stop( "Pass $_" );
    print "During pass $_; $count uniq numbers were previously unset";
}

$T->report;

__END__
P:\test>429615
During pass 1; 632091 uniq numbers were previously unset
During pass 2; 0 uniq numbers were previously unset
1 trial of Pass 1 ( 11.891s total), 11.891s/trial
1 trial of Pass 2 ( 6.469s total), 6.469s/trial

P:\test>dir uniqints.dat
 Volume in drive P has no label.
 Volume Serial Number is BCCA-B4CC

 Directory of P:\test

10/02/2005  11:17           125,000 uniqints.dat
               1 File(s)        125,000 bytes
               0 Dir(s)  58,981,855,232 bytes free
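For completeness, the buffered variant mentioned above might look something like this -- untested and unbenchmarked here, so treat the ':bytes' layer and any speedup as assumptions to verify for yourself:

#! perl -slw
use strict;

{   ## Same closure idea, but buffered PerlIO instead of the sys* calls.
    my( $filename, $fh, $len ) = 'uniqints.dat';

    sub uniqPersistantBuffered {
        my( $no ) = @_;

        unless( $fh ) {
            my $mode = -e( $filename ) ? '+<:bytes' : '+>:bytes';
            open $fh, $mode, $filename or die "$filename ($mode) : $!";
            $len = -s $filename;
        }

        my( $oByte, $bit ) = ( int( $no / 8 ), $no % 8 );

        ## Extend the file with nulls if writing beyond the current end
        if( $len <= $oByte ) {
            my $extend = $oByte - $len + 1;
            seek $fh, 0, 2;              ## to EOF
            print $fh chr(0) x $extend;
            $len += $extend;
        }

        seek $fh, $oByte, 0;
        read $fh, my $byte, 1;

        return 1 if vec $byte, $bit, 1;

        vec( $byte, $bit, 1 ) = 1;
        seek $fh, $oByte, 0;             ## a seek is required between read and write
        print $fh $byte;
        return;
    }
}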