in reply to Re: Issue with cloning and large structure processing
in thread Issue with cloning and large structure processing

But I need these copies. Each copy is different. I'm copying an element of the AoH, changing some values in the hash, and pushing the changed element back onto the array.

Re^3: Issue with cloning and large structure processing
by zwon (Abbot) on Apr 10, 2010 at 10:17 UTC

    I have no doubt that you need these copies. But why would you expect multiple copies to use the same amount of memory as a single structure? After each loop iteration your array contains one more element, so its size grows.

    use 5.010;
    use strict;
    use warnings;
    use Devel::Size qw(total_size);
    use Storable;

    my %hash = ( a => 1, b => 2, c => 3 );
    say "Size of hash is: ", total_size(\%hash);

    my $ref;
    for (1..100_000) {
        push @$ref, Storable::dclone(\%hash);
    }
    say "Size of array is: ", total_size($ref);

    __END__
    Size of hash is: 313
    Size of array is: 31448769
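    For contrast, here is a minimal sketch (using only core modules) of why dclone costs memory while plain references do not: pushing the same reference repeatedly shares one underlying hash, whereas each dclone call builds an independent deep copy. The refaddr check makes the difference visible without needing Devel::Size.

    use 5.010;
    use strict;
    use warnings;
    use Scalar::Util qw(refaddr);
    use Storable qw(dclone);

    my %hash = ( a => 1, b => 2, c => 3 );

    my (@shared, @cloned);
    for (1 .. 3) {
        push @shared, \%hash;            # same hash every time: no copy made
        push @cloned, dclone(\%hash);    # fresh deep copy every iteration
    }

    # All @shared elements point at the one original hash...
    say refaddr($shared[0]) == refaddr($shared[1]) ? "shared" : "distinct";
    # ...while every clone is a separate structure that costs its own memory.
    say refaddr($cloned[0]) == refaddr($cloned[1]) ? "shared" : "distinct";

    This is why mutating one @shared element would change them all, and why the OP's use case genuinely needs the clones (and the memory that comes with them).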