in reply to reading (caching?) large files
I use fairly large files myself (among others, tab-delimited), and I first try to read them into memory at once. Sometimes that is not possible, and then I try to minimize the data in a row-read-write approach: read a row, keep only what you need, and write it out immediately. As long as you use the proper functions, the OS will take care of caching.
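A minimal sketch of that row-read-write approach. The file names and the columns kept (0 and 2) are just for illustration; the point is that only one line is in memory at a time:

```perl
use strict;
use warnings;

# Create a tiny sample input so the sketch is self-contained;
# in practice this would be your existing large tab-delimited file.
open my $make, '>', 'big_input.tab' or die "create input: $!";
print {$make} "id\tname\tscore\textra\n1\tfoo\t42\tx\n";
close $make;

open my $in,  '<', 'big_input.tab' or die "open input: $!";
open my $out, '>', 'reduced.tab'   or die "open output: $!";

# Stream one row at a time; keep only the columns needed.
while ( my $line = <$in> ) {
    chomp $line;
    my @fields = split /\t/, $line;
    print {$out} join( "\t", @fields[ 0, 2 ] ), "\n";  # keep id and score
}

close $in;
close $out;
```

Because the loop never holds more than one line, memory use stays flat no matter how big the input is.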
In cases where I need a large amount of data available for lookup/search/sort, I use BerkeleyDB. Very nice performance.
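One way to get at Berkeley DB from Perl is the long-standing DB_File module, which ties a hash to an on-disk B-tree so lookups don't require the whole data set in memory. A sketch (the file name `lookup.db` and the keys are hypothetical):

```perl
use strict;
use warnings;
use DB_File;
use Fcntl;

# Tie a hash to an on-disk Berkeley DB B-tree; reads and writes
# on %db go straight to the file instead of RAM.
tie my %db, 'DB_File', 'lookup.db', O_RDWR | O_CREAT, 0644, $DB_BTREE
    or die "tie: $!";

$db{'camel'}  = 'Perl';
$db{'python'} = 'snake';

my $camel = $db{'camel'};    # an ordinary hash lookup, served from disk
print "camel => $camel\n";

untie %db;
unlink 'lookup.db';          # clean up the demo file
```

The BerkeleyDB module offers a richer interface to the same library, but for simple key/value lookup a tied DB_File hash is usually enough.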
Hope this helps,
Jeroen
"We are not alone"(FZ)
Replies are listed 'Best First'.
Re: Re: reading (caching?) large files by perchance (Monk) on Jun 05, 2001 at 17:11 UTC
    by jeroenes (Priest) on Jun 05, 2001 at 17:25 UTC