"Perhaps it's the fact that you're building a structure out of our 100M file in RAM that's the bottleneck"
Quite possible. Data takes much more space as Perl variables than it does in a file, especially since hash keys are strings.
use strict;
use warnings;
use Devel::Size qw( total_size );

# Build an in-memory "file" of 8-byte records: a 4-byte ID followed by a 4-byte value.
my $file = join '', map pack('NN', @$_), (
    [ 273, 1234 ], [ 273, 5678 ],
    [ 274, 1234 ], [ 275, 5678 ],
    [ 276, 1234 ], [ 277, 5678 ],
    [ 278, 1234 ], [ 278, 5678 ],
);

# Unpack each record and group the values by ID.
# The /s flag lets "." match newline bytes, which can occur in binary data.
my %ValArrayByID;
while ($file =~ /(.{8})/sg) {
    my ($ID, $Val) = unpack('NN', $1);
    push @{$ValArrayByID{$ID}}, $Val;
}

print("File size: ", length($file), " bytes\n");
print("Memory usage: ", total_size(\%ValArrayByID), " bytes\n");
File size: 64 bytes
Memory usage: 922 bytes
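A minimal sketch (not from the original post) of one way to shrink that overhead: keep the values packed, appending each 4-byte value to a string per ID instead of pushing it onto a Perl array, and only unpack an ID's values when they are actually needed. The hash name and the Devel::Size comparison here are illustrative assumptions.

use strict;
use warnings;
use Devel::Size qw( total_size );

# Same toy data as above: 8-byte records of ID followed by value.
my $file = join '', map pack('NN', @$_), (
    [ 273, 1234 ], [ 273, 5678 ],
    [ 274, 1234 ], [ 275, 5678 ],
    [ 276, 1234 ], [ 277, 5678 ],
    [ 278, 1234 ], [ 278, 5678 ],
);

# Group values by ID, but keep them as packed 4-byte chunks (hypothetical hash name).
my %PackedValsByID;
while ($file =~ /(.{8})/sg) {
    my ($ID, $Val) = unpack('NN', $1);
    $PackedValsByID{$ID} .= pack('N', $Val);   # stay packed instead of building arrays
}

# Unpack lazily, only for the ID you actually need.
my @vals_273 = unpack('N*', $PackedValsByID{273});
print "Values for 273: @vals_273\n";

print "Memory usage: ", total_size(\%PackedValsByID), " bytes\n";

The per-element overhead of Perl scalars and arrays is avoided for the values themselves; only the hash keys and one string per ID remain, at the cost of an unpack when the values are read.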
In reply to Re^2: Unpacking small chucks of data quickly
by ikegami
in thread Unpacking small chucks of data quickly
by Anonymous Monk