Rather than writing the array to disk, having sort read it from disk and write the results back to disk, and then reading them in again, you can save a couple of steps by piping the data directly to sort via a piped open and letting it write the results to disk for you to read back. The following processes 10 million elements in around 3 minutes on my machine.
#! perl -slw
use strict;
use Benchmark::Timer;

our $N ||= 1e6;

my $T = new Benchmark::Timer;

my @array;
push @array, sprintf "Test%07d", int rand 32767 for 1 .. $N;

$T->start( 'sort -u' );

## 'u:sort' is an external (Unix-style) sort; adjust the path for your system.
open my $pipe, "| u:sort -u > temp.tmp" or die $!;
print $pipe $_ for @array;
close $pipe;

open my $in, '<', 'temp.tmp' or die $!;
my $i = 0;
$array[ $i++ ] = $_ while <$in>;
close $in;

$#array = $i - 1; ## Corrected, with thanks to [johngg]

$T->stop( 'sort -u' );
$T->report;

printf "Array now contains %d uniq elements\n", scalar @array;

__END__
[ 8:17:47.37] c:\test>551753-2 -N=1e7
1 trial of sort -u  (183.281s total)
Array now contains 32768 uniq elements
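The heavy lifting in the script above is delegated to the external sort's -u switch, which discards duplicate lines as it sorts. A minimal illustration of that behaviour on its own (the Test strings here are just made-up sample data):

```shell
# sort -u both orders the lines and removes the repeated one in a single pass.
printf 'Test0000003\nTest0000001\nTest0000003\nTest0000002\n' | sort -u
# → Test0000001
#   Test0000002
#   Test0000003
```

This is why the piped-open approach needs no dedup logic of its own in Perl; the pipe hands the whole problem to the external tool.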
In theory, you could use IPC::Run3 to avoid hitting the disk at all, but then you have the problem of needing storage for both the input and the output at the same time. The code above reads the data back into the original array and truncates the leftovers, thereby avoiding any extra memory growth.
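For completeness, here is a minimal sketch of that in-memory variant, assuming IPC::Run3 is installed (it is not a core module) and that a Unix-style sort is on your PATH. Note that $in and $out each hold a full copy of the data at the same time, which is exactly the memory cost mentioned above:

```perl
use strict;
use warnings;
use IPC::Run3;    # run3 is exported by default

my @array = map { sprintf "Test%07d", int rand 32767 } 1 .. 1_000;

## Both the input and output strings live in memory simultaneously.
my $in  = join( "\n", @array ) . "\n";
my $out = '';
run3 [ 'sort', '-u' ], \$in, \$out;    # no temp file touches the disk

my @uniq = split /\n/, $out;
printf "Array now contains %d uniq elements\n", scalar @uniq;
```

Whether the doubled footprint is acceptable depends on the data size; for the 10-million-element case above, the temp-file round trip is the safer bet.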
In reply to Re: Saving an array to a disk file
by BrowserUk
in thread Saving an array to a disk file
by Anonymous Monk