It looks like hashes (and probably arrays too?) that have been used slow down some operations (like keys()), even after they have been emptied... and the only way I've found to work around this slowdown is to use undef() to clean things up.
I haven't found this effect described in the docs, so maybe it's a bug... or please point me to where this feature is documented. :-)
Perl 5.8.2, Glibc 2.2.5, Linux 2.4.22
WBR, Alex.
#!/usr/bin/perl
use Benchmark qw(:all);

%h_delete = %h_list = %h_undef = (0 .. 50000);

delete @h_delete{keys %h_delete};
%h_list = ();
undef %h_undef;

timethese 10000, {
    delete => '1 for keys %h_delete',
    list   => '1 for keys %h_list',
    undef  => '1 for keys %h_undef',
};
__END__
Benchmark: timing 10000 iterations of delete, list, undef...
    delete:  5 wallclock secs ( 2.43 usr +  0.00 sys =  2.43 CPU) @ 4115.23/s (n=10000)
      list:  5 wallclock secs ( 2.44 usr +  0.00 sys =  2.44 CPU) @ 4098.36/s (n=10000)
     undef:  0 wallclock secs ( 0.02 usr +  0.00 sys =  0.02 CPU) @ 500000.00/s (n=10000)
            (warning: too few iterations for a reliable count)
In reply to undef speedup ?! by powerman