#!/usr/bin/perl
use strict;
use warnings;
use Benchmark qw/cmpthese/;
use PDL::LiteF;

my @data = (
    [ 0,  3, 2, 1 ],
    [ 1, 11, 1, 2 ],
    [ 5, -2, 0, 1 ],
);

# It's not fair to make the conversion every time.
my $pdldata = pdl @data;

# Column averages with plain Perl loops.
sub using_array {
    my @data = @_;
    my @sums;
    for my $i ( 0 .. $#data ) {
        $sums[0] += $data[$i][0];
        $sums[1] += $data[$i][1];
        $sums[2] += $data[$i][2];
        $sums[3] += $data[$i][3];
    }
    $sums[$_] /= @data for 0 .. 3;
    return @sums;
}

# Column averages with PDL: divide by the number of rows, then sum over rows.
sub using_pdl {
    my $pdldata = shift;
    $pdldata /= $pdldata->getdim(1);
    return $pdldata->transpose->sumover;
}

cmpthese(
    100000,
    {
        'Array-based' => sub { using_array(@data) },
        'PDL-based'   => sub { using_pdl($pdldata) },
    }
);
Result:

                 Rate   PDL-based Array-based
PDL-based     36496/s          --        -67%
Array-based  111111/s        204%          --

Apparently, for a dataset of this size (3 by 4) it is not worth using PDL. The good thing, though, is that the PDL-based subroutine can be applied to a bidimensional piddle of arbitrary size without modification.
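For instance, here is a minimal sketch (not from the original post; it assumes the using_pdl subroutine above and PDL's random constructor) of the same subroutine averaging the columns of a much larger piddle without any change:

# Hypothetical example: a 1000 x 50 piddle (dim 0 = 50 columns, dim 1 = 1000 rows).
my $big = random( 50, 1000 );

# Pass a copy, since using_pdl divides its argument in place.
my $avg = using_pdl( $big->copy );

print $avg->nelem, "\n";    # prints 50 -- one average per column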
I suppose that PDL scales much better, though: I've used it on multidimensional piddles of 1e7 elements and seen a 50-fold speedup over a traditional array-based implementation.
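As a rough sketch of how that comparison might be repeated at a larger scale (the sizes, iteration budget, and helper code below are illustrative assumptions, not the original measurements):

#!/usr/bin/perl
use strict;
use warnings;
use Benchmark qw/cmpthese/;
use PDL::LiteF;

my $rows = 1_000;
my $cols = 1_000;    # 1e6 elements; raise $rows for the 1e7 case

# Build the same random data twice: as a Perl array of arrays and as a piddle.
my @big    = map { [ map { rand } 1 .. $cols ] } 1 .. $rows;
my $bigpdl = pdl \@big;

cmpthese( -3, {
    'Array-based' => sub {
        my @sums = (0) x $cols;
        for my $row (@big) {
            $sums[$_] += $row->[$_] for 0 .. $cols - 1;
        }
        $_ /= $rows for @sums;
    },
    'PDL-based' => sub {
        # Sum over rows, then divide once (no in-place modification).
        my $avg = $bigpdl->transpose->sumover / $rows;
    },
} );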
In reply to Re^2: Averaging Elements in Array of Array
by bruno
in thread Averaging Elements in Array of Array
by neversaint