in reply to Re: Writing hashes as records to a CSV file
in thread Writing hashes as records to a CSV file
Actually, using hash slices assigned empty lists is much faster:
@H{keys %$_} = () for @$AoH;
DEMO:

use v5.12;
use warnings;
use Test::More;
use Benchmark qw/cmpthese/;

my $AoH;

for my $n_rec ( 1, 10, 100, 1000 ) {
    say;
    say "=== num of records is: ", $n_rec;

    $AoH = create_data( 1, $n_rec );

    is_deeply(
        [ sort &list_join ],
        [ sort &slice_join ],
    );

    cmpthese( -1, {
        'list_join'  => \&list_join,
        'slice_join' => \&slice_join,
    } );
}

done_testing;

sub list_join {
    my %H;
    %H = ( %H, %$_ ) for @$AoH;
    return keys %H;
}

sub slice_join {
    my %H;
    @H{ keys %$_ } = () for @$AoH;
    return keys %H;
}

sub create_data {
    my ( $density, $records ) = @_;
    my @AoH;
    push @AoH,
        { map { rand 100 <= $density ? ( "$_" => $_ ) : () } "A" .. "ZZ" }
        for 1 .. $records;
    return \@AoH;
}

OUTPUT:
=== num of records is: 1
ok 1
                 Rate  list_join slice_join
list_join    238532/s         --       -65%
slice_join   682713/s       186%         --

=== num of records is: 10
ok 2
                 Rate  list_join slice_join
list_join      7819/s         --       -93%
slice_join   112993/s      1345%         --

=== num of records is: 100
ok 3
                Rate  list_join slice_join
list_join     82.9/s         --       -99%
slice_join    8533/s     10195%         --

=== num of records is: 1000
ok 4
                Rate  list_join slice_join
list_join     3.66/s         --      -100%
slice_join    1067/s     29072%         --

1..4
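Aside (my addition, not part of the original post): the list-assignment version `%H = (%H, %$_)` flattens the whole accumulated hash into a list and rebuilds it for every record, so its cost grows roughly quadratically with the total number of keys, while the slice version only touches the current record's keys. A minimal standalone sketch of the slice idiom, with made-up sample data:

```perl
use strict;
use warnings;

# Two hypothetical records with overlapping keys (illustration only).
my @AoH = ( { a => 1, b => 2 }, { b => 3, c => 4 } );

# Collect the union of all keys: assigning an empty list to a hash
# slice creates the keys (values become undef) without rebuilding %H.
my %H;
@H{ keys %$_ } = () for @AoH;

print join( ",", sort keys %H ), "\n";    # a,b,c
```

Only the keys matter here, which is why assigning `()` is enough; the per-record work is proportional to that record's key count, not to the size of `%H` so far.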
Cheers Rolf
(addicted to the Perl Programming Language :)
Wikisyntax for the Monastery
Update: fixed bug in sorted tests.
Replies:
- Re^3: Writing hashes as records to a CSV file (joining keys with slices)
  - by Tux (Canon) on Dec 09, 2021 at 14:06 UTC
  - by choroba (Cardinal) on Dec 09, 2021 at 21:41 UTC
  - by Tux (Canon) on Dec 10, 2021 at 08:03 UTC
  - by LanX (Saint) on Dec 09, 2021 at 14:45 UTC