open(my $handle, '-|', 'gunzip', '-c', $filename) or die $!;
Liz
Is this generally a good idea at all? What do you think, people?
At some point, you have to accept the fact that flat files are a very inefficient database format, but yes, compression can help quite a bit. I added compression support to CGI::Search in version 0.5, and some simple benchmarks I ran against one of our real-world flat-file databases showed a 12-fold increase in speed. I don't think I have the benchmarking code around anymore, sorry. YMMV, especially since compression ratios aren't guaranteed.
---- I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
-- Schemer
: () { :|:& };:
Note: All code is untested, unless otherwise stated
The first thought that comes to my mind is to make your own custom binary format. For instance, you could split your flat database into A..Z sections and compress each section individually. Or you could place some sort of "binary marker" in the compressed file, then search for it and decompress only that chunk. Or you could keep a second index file detailing where in the main file each section begins and ends, and just seek to the right spot.
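The index-file idea above can be sketched roughly like this. This is an untested illustration (in keeping with this thread's spirit), and the one-record-per-line "key:value" layout and the `section()` helper are my own assumptions, not anything from CGI::Search: one pass records the byte offset where each initial letter's section starts, and lookups then `seek` straight to that section instead of scanning from the top.

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Hypothetical flat file: one sorted "key:value" record per line.
my ($out, $path) = tempfile();
print $out "apple:1\nanchor:2\nbanana:3\nberry:4\ncherry:5\n";
close $out;

# Pass 1: remember the byte offset where each initial letter first appears.
my %index;
open my $in, '<', $path or die $!;
while (1) {
    my $pos = tell $in;
    defined(my $line = <$in>) or last;
    my $letter = uc substr $line, 0, 1;
    $index{$letter} //= $pos;
}

# Lookup: jump straight to a letter's section and read until it ends.
sub section {
    my ($letter) = @_;
    return () unless exists $index{$letter};
    seek $in, $index{$letter}, 0;
    my @records;
    while (my $line = <$in>) {
        last if uc(substr $line, 0, 1) ne $letter;
        chomp $line;
        push @records, $line;
    }
    return @records;
}

print join(",", section('B')), "\n";   # banana:3,berry:4
```

The same index could of course point into per-section gzip chunks instead of plain text; the win is that you only decompress the section you actually need.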
You could get one of the compression modules off CPAN and compress each record individually, then cat the results together with a separator between them. Basically you end up with something like a DBM.
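Here is a minimal sketch of that per-record scheme, using `compress`/`uncompress` from Compress::Zlib. One caveat worth noting: compressed data can contain any byte value, so a literal separator byte is risky; this sketch uses a 4-byte length prefix instead, which serves the same purpose safely. The `fetch()` helper is my own invention for illustration.

```perl
use strict;
use warnings;
use Compress::Zlib qw(compress uncompress);

my @records = ("first record "  x 20,
               "second record " x 20);

# Deflate each record on its own and prepend a 4-byte big-endian
# length (safer than a separator byte, since compressed output can
# contain any byte value, including the separator).
my $blob = '';
for my $rec (@records) {
    my $packed = compress($rec);
    $blob .= pack('N', length $packed) . $packed;
}

# To read record N, walk the length prefixes and inflate only that one;
# everything else stays compressed and untouched.
sub fetch {
    my ($data, $n) = @_;
    my $pos = 0;
    while ($pos < length $data) {
        my $len = unpack 'N', substr $data, $pos, 4;
        return uncompress(substr $data, $pos + 4, $len) if $n-- == 0;
        $pos += 4 + $len;
    }
    return undef;
}

print fetch($blob, 1) eq $records[1] ? "ok\n" : "mismatch\n";   # ok
```

A real DBM (or a tied hash over one) would give you keyed access on top of this, but the storage idea is the same: many small independently-compressed values in one file.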