In fact 20 million lines of data in flat files is a very solid indication it is time to put the data into an RDBMS.
On the contrary. Not putting them in a DB can make things _much_ more efficient. The real deciding factor is not the data load size, but the data access requirements and volatility of the data.
For instance, I receive about 2 gigs worth of records every day that I have to process. Never in their lives do these records see a DB. Just loading them into a DB and indexing them is significantly slower than the processing I need to do, and I only need to do that processing once (or very rarely twice). An RDBMS is not, IMO, suitable for managing large volumes of data that will only be accessed once or twice, are never changed, and can be disposed of once processed.
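To make the one-pass idea concrete, here is a minimal sketch: stream the flat file record by record, keep only the running aggregates, and throw each line away as soon as it is consumed. The tab-separated "key\tvalue" layout and the `sum_by_key` name are assumptions for illustration, not the actual job described above.

```perl
use strict;
use warnings;

# sum_by_key: aggregate a stream of tab-separated "key\tvalue"
# records in a single pass. Only the per-key totals are kept in
# memory; each input line is discarded after it is read, so the
# working set stays small no matter how big the file is.
sub sum_by_key {
    my ($fh) = @_;
    my %total;
    while ( my $line = <$fh> ) {
        chomp $line;
        my ( $key, $value ) = split /\t/, $line;
        $total{$key} += $value;
    }
    return \%total;
}

# Usage: process the flat file once, print the totals, done.
# open my $fh, '<', 'records.tsv' or die "records.tsv: $!";
# my $totals = sum_by_key($fh);
# print "$_\t$totals->{$_}\n" for sort keys %$totals;
```

No load step, no index build, and nothing to clean up afterwards, which is exactly the win when the data is read once and discarded.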
Anyway, just thought I'd throw that in there.
First they ignore you, then they laugh at you, then they fight you, then you win.
-- Gandhi
In reply to Re: Re: Force perl to release memory back to the operating system by demerphq
in thread Force perl to release memory back to the operating system by Roger