All I do with this file is append new records and authenticate user logins. Still, in many scripts I load the full file into lists (arrays) so as to process it.
My question is...
Do you really need to pull all the records in at once? Do you need to examine all the records simultaneously, or perform two or more passes through the file, before coming to a decision?
Honestly, I doubt that's the case. I'd bet a beer or two that you could get by with a sequential scan, which means you wouldn't have to load everything into an array, e.g.:
open IN, 'db.txt' or die "cannot open db.txt for input: $!\n";
while( <IN> ) {
    chomp;
    # note: split '|' would split between every character,
    # since | is a regex metacharacter -- it must be escaped
    my @fields = split /\|/;
    ...
}
close IN;
If you need to sort the "database", use the operating system's sort command (assuming you're on anything but Win32). If you need to perform multiple passes, just seek back to the beginning of the file and read it again, and rely on the fact that the OS will probably have buffered it in RAM.
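For instance, here's a minimal sketch of both ideas (the rewind-and-rescan loop and the piped sort). The field layout and the sort key are assumptions on my part, so adjust them to match your actual records:

# Pass 1: count the records without loading them.
open IN, 'db.txt' or die "cannot open db.txt for input: $!\n";
my $count = 0;
$count++ while <IN>;

# Pass 2: rewind and scan again -- still no array needed.
seek IN, 0, 0 or die "cannot seek on db.txt: $!\n";
while( <IN> ) {
    chomp;
    my @fields = split /\|/;
    # ... decide something using $count and @fields ...
}
close IN;

# Sorting: pipe the file through the OS sort command
# (sorting on the first pipe-delimited field here -- adjust -k to taste).
open SORTED, q{sort -t'|' -k1,1 db.txt |}
    or die "cannot run sort: $!\n";
while( <SORTED> ) {
    # ... records now arrive in sorted order ...
}
close SORTED;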
The more you can avoid pulling everything into RAM, the longer you'll be able to live with a flat file. Flat files are cool: you can debug them with vi!
And the day you need a real database, don't assume that has to mean MySQL. Postgres has a lot going for it (and version 7, which has been out for a few years now, is a hell of a lot easier to install than its predecessors).
-- g r i n d e r
just another bofh
print@_{sort keys %_},$/if%_=split//,'= & *a?b:e\f/h^h!j+n,o@o;r$s-t%t#u';