Wouldn't a shallow copy be faster for large arrays, since the data isn't actually being copied?
"Shallow" in this case refers to the fact that the elements of the array would be copied, hence it'd be slow and burn memory for large arrays, but if any of those elements are references to other data structures, then copying of those references means that the copies of the references in the new array still refer to the same data structures.
use warnings;
use strict;
use Data::Dump;

my %animals = ( cat => [ 1, 2, 3, ["nested", "array"] ] );
my @cat = @{ $animals{'cat'} };

push @cat, 4;
$cat[1] = 9;
$cat[3][0] = 'Nested';

dd \%animals;
dd \@cat;

__END__
{ cat => [1, 2, 3, ["Nested", "array"]] }
[1, 9, 3, ["Nested", "array"], 4]
Note how the 9 and the 4 only affected @cat, while the modification to "Nested" is visible via both %animals and @cat. ["nested","array"] is a single anonymous array stored somewhere in memory; if you copy a reference to that array, the copy of the reference still points to the same anonymous array. A deep copy can be achieved with, for example, dclone from Storable (often used because Storable is a core module), or with Clone from CPAN.
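As a minimal sketch of the dclone approach (reusing the %animals structure from the example above - the variable names are just for illustration):

use warnings;
use strict;
use Storable qw(dclone);
use Data::Dump;

my %animals = ( cat => [ 1, 2, 3, ["nested", "array"] ] );

# dclone takes a reference and returns a reference to a deep copy
my $cat_copy = dclone( $animals{'cat'} );

# modifying the nested array in the copy...
$cat_copy->[3][0] = 'Nested';

# ...leaves the original untouched
dd \%animals;   # { cat => [1, 2, 3, ["nested", "array"]] }
dd $cat_copy;   # [1, 2, 3, ["Nested", "array"]]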
And "changes to the array"... oh, right, as opposed to changes to the elements of the array?
Both!
Update: I should add that the distinction between shallow and deep copies usually bites people when they write my %copy = %data; and expect a deep copy instead of a shallow one. But if your copies really are read-only (and the arrays are small), then a shallow copy doesn't really hurt - though I would still recommend using references instead of shallow copies.
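To illustrate that point, here's a small sketch (the hash %data and its contents are made up for the example) showing how a shallow hash copy still shares nested data, and how a reference avoids copying altogether:

use warnings;
use strict;
use Data::Dump;

my %data = ( name => "cat", toys => [ "ball", "string" ] );

# shallow copy: the top-level keys and values are copied, but
# $copy{toys} and $data{toys} are the same arrayref
my %copy = %data;
$copy{name} = "dog";             # only affects %copy
push @{ $copy{toys} }, "laser";  # visible through %data as well!

dd \%data;  # { name => "cat", toys => ["ball", "string", "laser"] }
dd \%copy;  # { name => "dog", toys => ["ball", "string", "laser"] }

# for read-only access, no copy is needed at all - just take a reference
my $ref = \%data;
print $ref->{name}, "\n";  # "cat"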
In reply to Re^6: Referencing the locals by haukex, in thread Referencing the locals by Chuma