in reply to Memory issue with cancer data (analogy)
Your analogy doesn't help.
You are constructing a 3-dimensional array, and you are running out of memory. Assuming $data[ 0..X ][ 0..Y ][ 0..Z ];
What we need from you is:

1. Is the array sparsely or densely populated? If sparse, what is the approximate density?

   If one or more of X, Y & Z can run to, say, 3000..4000, or if instead of using every number between 0 and m you only use every 10th or 100th, then you can save substantial space by using a hash instead of an array for that dimension of the structure (see the sketch just after this list).

2. Could you build (say) all of $data[1][Y][Z] for X=1, calculate the stats, and then discard that before building all of $data[2][Y][Z] for X=2? (There is a sketch of this at the end of this post.)

3. What does each element of the array hold? Is it just a number? If so, how big will that number get?

   If, for example, each element of the array held an integer that fits in a byte (0..255), then you can easily substitute a string for the third-level arrays and save huge amounts of memory.
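To make the hash-per-dimension idea in point 1 concrete, here is a minimal sketch. It is only an illustration: it assumes (invented for the example) that just every 100th X value ever occurs, and it fills in random byte values as placeholder data.

use strict; use warnings;

my %data;    # hash for the sparse X dimension; ordinary arrays for the dense Y and Z
for my $x ( map { $_ * 100 } 0 .. 9 ) {               # only 10 distinct X values actually occur
    for my $y ( 0 .. 99 ) {
        for my $z ( 0 .. 99 ) {
            $data{ $x }[ $y ][ $z ] = int( rand 256 );   # placeholder data
        }
    }
}
print scalar( keys %data ), " X slices stored\n";     # only the X values that exist take space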
E.g. to illustrate the string idea from point 3: this constructs a 100x100x100 3D array of small integers, which requires 33MB of memory (the sizes shown are measured with Devel::Size's total_size):
@data = map[ map[ map int( rand 256 ), 0..99 ], 0..99 ], 0..99;;
print total_size \@data;;
33454784
This, on the other hand, packs the third dimension into strings, giving a 100x100 2D array of 100-byte strings. It contains exactly the same information, but requires only 1.6MB:
@data = map[ map pack( 'C*', map int( rand 256 ), 0..99 ), 0..99 ], 0..99;;
print total_size \@data;;
1614784
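If you adopt that packed-string layout, reading and writing individual elements is still easy: element (X,Y,Z) is simply byte Z of the string at $data[X][Y]. A minimal sketch (the indices are arbitrary examples):

use strict; use warnings;

# Same construction as above: a 100x100 array of 100-byte strings.
my @data = map { [ map { pack 'C*', map { int rand 256 } 0 .. 99 } 0 .. 99 ] } 0 .. 99;

my( $x, $y, $z ) = ( 1, 2, 3 );                      # arbitrary example indices

my $value = vec( $data[ $x ][ $y ], $z, 8 );         # read one byte
# equivalently: ord substr $data[ $x ][ $y ], $z, 1

vec( $data[ $x ][ $y ], $z, 8 ) = 42;                # write one byte in place

print "element ($x,$y,$z) was $value, is now ", vec( $data[ $x ][ $y ], $z, 8 ), "\n";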
If you give us the information we ask for, we can almost certainly help you reduce your memory requirements.
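And to illustrate point 2, a rough sketch of processing one X slice at a time, so that only a single Y-by-Z plane is ever in memory. The get_value() and calc_stats() routines and the sizes here are just stand-ins for however you actually obtain each data point and compute your statistics:

use strict; use warnings;

my( $X_MAX, $Y_MAX, $Z_MAX ) = ( 99, 99, 99 );       # placeholder sizes

sub get_value { int rand 256 }                       # stand-in for your real data source
sub calc_stats { my( $x, $slice ) = @_; }            # stand-in for your real statistics

for my $x ( 0 .. $X_MAX ) {
    my @slice;                                       # just this one Y x Z plane
    for my $y ( 0 .. $Y_MAX ) {
        for my $z ( 0 .. $Z_MAX ) {
            $slice[ $y ][ $z ] = get_value( $x, $y, $z );
        }
    }
    calc_stats( $x, \@slice );
    # @slice goes out of scope here, so its memory can be reused for the next X
}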