'out of memory' when processing data that should generate a 2-dimensional 'array' 17120 elements wide and 8400 lines long.... XP Pro and 4 Gigs of RAM
A 17120x8400 array of integers requires roughly 4.5GB of RAM.
If your XP is running 32-bit, the most RAM available to a Perl process is 2GB, so you will run out of memory.
If you are running 64-bit Windows and a 64-bit Perl, then with only 4GB of physical RAM you will move into swapping, and processing will get horribly slow.
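For a rough sense of where that 4.5GB figure comes from, here is a back-of-envelope sketch; the ~32 bytes per scalar is an assumed average, and the real per-element cost depends on your Perl build:

    use strict;
    use warnings;

    # Rough estimate: each Perl integer scalar carries roughly 32 bytes of
    # overhead (an assumption; varies by build), plus array bookkeeping.
    my $cols  = 17_120;
    my $rows  = 8_400;
    my $cells = $cols * $rows;              # 143,808,000 elements
    my $bytes = $cells * 32;                # ~4.6e9 bytes
    printf "%d cells => ~%.1f GB\n", $cells, $bytes / 2**30;   # ~4.3 GB, plus array overhead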
Is it necessary for your algorithm that you build the entire data structure in memory?
Or could you produce your data one row at a time and then write it to disk before starting on the next? The example I posted above does this and only requires 9MB of RAM.
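A minimal sketch of that row-at-a-time approach (this is not the example referenced above; compute_cell and the output filename are placeholders for whatever your real calculation and destination are):

    use strict;
    use warnings;

    my ($cols, $rows) = (17_120, 8_400);

    open my $out, '>', 'rows.dat' or die "open: $!";
    for my $y (0 .. $rows - 1) {
        # build one row, write it out, and let it go out of scope
        # before the next pass -- only one row is ever held in memory
        my @row = map { compute_cell($_, $y) } 0 .. $cols - 1;
        print {$out} join(' ', @row), "\n";
    }
    close $out or die "close: $!";

    sub compute_cell {
        my ($x, $y) = @_;
        return ($x + $y) % 256;    # placeholder for the real calculation
    }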
If it is absolutely necessary to build the whole thing in memory before outputting it, then I would pack the integers in memory. An array of 8400 packed strings, each 17120 * 4 bytes = 68,480 bytes (~67KB), comes to only about 0.5GB. Whilst unpacking/repacking the values to perform the required math would slow things down a little, it will still be much faster than the alternatives.
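A sketch of that packed-strings idea, assuming 32-bit unsigned values; get_cell and set_cell are illustrative helper names, not anything from the thread:

    use strict;
    use warnings;

    my ($cols, $rows) = (17_120, 8_400);

    # one zero-filled packed string per row: 17120 x 4 bytes = 68,480 bytes,
    # so 8400 rows come to roughly 0.5GB instead of several GB of scalars
    my @grid = ("\0" x ($cols * 4)) x $rows;

    # read one cell: extract 4 bytes at the column offset and unpack them
    sub get_cell {
        my ($y, $x) = @_;
        return unpack 'L', substr($grid[$y], $x * 4, 4);
    }

    # write one cell: pack the value and splice its 4 bytes back in place
    sub set_cell {
        my ($y, $x, $val) = @_;
        substr($grid[$y], $x * 4, 4) = pack('L', $val);
    }

    set_cell(5, 100, 42);
    print get_cell(5, 100), "\n";   # prints 42

Perl's vec builtin (e.g. with a width of 32 bits) offers similar fixed-width access into a string and could be used instead of the substr/pack pair.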
In reply to Re^4: Handling HUGE amounts of data
by BrowserUk
in thread Handling HUGE amounts of data
by Dandello