JPaul has asked for the wisdom of the Perl Monks concerning the following question:
I come to you today with a logistical problem rather than a programming one.
Firstly, I am doing this on a Linux box.
I am working on a script that deals with very large arrays which, left to their own devices, will fill up all the memory on the machine.
The most logical thing I can think of is a quick sub to swap the contents of the array in and out, so I'm not completely filling up the memory -- but the question is, for speed and efficiency, what should I swap the unused data out to?
Imagine a web spider that caches unchecked links as it goes by pushing them onto an array. Leave it for a few hours/days, and eventually you're out of memory.
I figure what makes sense is keeping a number of entries in memory (say, 50,000), and pushing all new URLs onto an overflow stack that lives outside of memory. When I have depleted the 50,000 URLs in my array, I swap 50,000 back in from the stack and carry on with those.
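The scheme above can be sketched in a few lines of Perl. This is only an illustration, not the poster's actual code: the batch size, the spill-file name, and the sub names are all made up, and it treats the overflow as a plain append-only text file read back in batches.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $BATCH = 50_000;           # how many URLs to keep in memory (arbitrary)
my $spill = 'url_stack.txt';  # hypothetical on-disk overflow file
my @queue;                    # the in-memory window

# Add a URL: keep it in memory if there is room, otherwise spill to disk.
sub add_url {
    my ($url) = @_;
    if (@queue < $BATCH) {
        push @queue, $url;
    } else {
        open my $fh, '>>', $spill or die "append $spill: $!";
        print $fh "$url\n";
        close $fh;
    }
}

# Take the next URL; refill from disk when the in-memory batch is used up.
sub next_url {
    refill() unless @queue;
    return shift @queue;      # undef when both memory and disk are empty
}

# Pull up to $BATCH lines back into memory and rewrite the remainder.
sub refill {
    return unless -s $spill;
    open my $fh, '<', $spill or die "read $spill: $!";
    while (@queue < $BATCH and defined(my $line = <$fh>)) {
        chomp $line;
        push @queue, $line;
    }
    my @rest = <$fh>;
    close $fh;
    open my $out, '>', $spill or die "rewrite $spill: $!";
    print $out @rest;
    close $out;
}
```

Note that the rewrite-the-remainder step in `refill` copies the file on every swap; for very deep stacks you would want something smarter than a flat file, which is exactly the storage question below.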
Making sense? Good. So -- how do I store the stack? A friend suggested using an RDBMS, which would work, but could there be a faster way?
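One middle ground between a flat file and a full RDBMS is a tied on-disk array. As a hedged sketch (the file name is made up, and this assumes the DB_File module and Berkeley DB are available, as they commonly were on Linux systems of that era), DB_File's RECNO format lets you `tie` a Perl array to a file, so `push` and `pop` go to disk instead of RAM:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DB_File;   # assumption: Berkeley DB bindings are installed
use Fcntl;

# Tie an array to a record-oriented Berkeley DB file; the array's
# contents live on disk, so it can grow far beyond available memory.
my @stack;
tie @stack, 'DB_File', 'url_stack.db', O_CREAT | O_RDWR, 0666, $DB_RECNO
    or die "tie url_stack.db: $!";

push @stack, 'http://example.com/page1';   # goes to disk
my $url = pop @stack;                      # comes back off disk

untie @stack;
```

Because the tie interface preserves normal array semantics, the rest of the spider would not need to change much, though each push/pop now costs a disk operation rather than a memory one.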
Thanks all,
JP
-- Alexander Widdlemouse undid his bellybutton and his bum dropped off --
Replies are listed 'Best First'.

- Re: How to best handle memory intensive operations, by bikeNomad (Priest) on Jul 26, 2001 at 23:59 UTC
  - by JPaul (Hermit) on Jul 27, 2001 at 00:59 UTC
  - by bikeNomad (Priest) on Jul 27, 2001 at 01:02 UTC
- Re: How to best handle memory intensive operations, by John M. Dlugosz (Monsignor) on Jul 27, 2001 at 00:04 UTC