in reply to Issue with cloning and large structure processing
You're going to have to show us a bit more of your code--like where does %some_el come from?--because on the face of it, 130MB is far too big for a hash built from 8MB of data.
This creates an 8MB file of keys and values, loads them into a hash, and the total size is just 12MB:
```
c:\test>perl -E"printf qq[%014d: %014d\n], $_, $_ for 1..262144" >junk.dat

c:\test>dir junk.dat
10/04/2010  11:06         8,388,608 junk.dat

c:\test>perl -MDevel::Size=total_size -E"local$/; my %h = split ': ', <>; print total_size \%h;" junk.dat
12489744
```
Of course, if the 8MB contains more than just a flat hash structure, then the memory requirement will be more, but 10x more is stretching the imagination a bit. So, it probably comes down to what else you are doing in your code. Real code is always more likely to result in a resolution than pseudo-code.
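For comparison, the same effect can be sketched in Python (a rough analogue, not the original Perl demo): build the same 262,144 fixed-width key/value pairs and estimate the structure's in-memory footprint against the raw file size. Each small string carries a fixed per-object overhead, so the ratio grows well beyond 1x even for a flat mapping, but nowhere near 10x of a figure like 130MB would imply for nested structures of this size.

```python
# Analogue of the Devel::Size demo above: 262144 "key: value" pairs,
# keys and values each 14 digits wide, as in the Perl one-liner.
import sys

N = 262144
data = {f"{i:014d}": f"{i:014d}" for i in range(N)}

# Each line in the flat file is "<14 digits>: <14 digits>\n" = 31 bytes
# (32 on Windows, where \n becomes \r\n, matching the 8,388,608 above).
raw_bytes = N * len("00000000000001: 00000000000001\n")

# Shallow dict size plus the size of every key and value object; a crude
# estimate of what Devel::Size::total_size reports for the Perl hash.
deep_bytes = sys.getsizeof(data) + sum(
    sys.getsizeof(k) + sys.getsizeof(v) for k, v in data.items()
)

print(f"raw file size : {raw_bytes:>12,d} bytes")
print(f"in-memory size: {deep_bytes:>12,d} bytes")
print(f"overhead ratio: {deep_bytes / raw_bytes:.1f}x")
```

The exact ratio varies by interpreter version, but the point is the same as the Perl measurement: per-entry overhead is fixed and modest, so a flat hash built from 8MB of text lands in the tens of megabytes at most, not 130MB.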
Replies are listed 'Best First'.
Re^2: Issue with cloning and large structure processing
  by scathlock (Initiate) on Apr 10, 2010 at 11:02 UTC
  by BrowserUk (Patriarch) on Apr 10, 2010 at 11:41 UTC
  by scathlock (Initiate) on Apr 10, 2010 at 11:53 UTC
  by BrowserUk (Patriarch) on Apr 10, 2010 at 13:06 UTC