PerlMonks
Re: is perl the best tool for this job? by jdtoronto (Prior)
on Oct 19, 2003 at 17:56 UTC ( [id://300404] )
I'm with BrowserUK on this one. Unless you need the sophistication of PDL for this, and it doesn't seem that you do, the read-it-all-in approach is a good one.
One of my clients collects huge amounts of data. Once a record is about six months old, the chance of it being needed again is roughly 1 in 10^6 per year, so we archive the data in pipe-delimited files (more storage-efficient than CSV) on an archive server. Files run from 20 to 800 MB each.

When I took over managing their system about four years ago, they were using "find text in a file" from Windows Explorer to search the archives. I asked how long a search took and they said, oh, between 16 and 48 hours! Oops! So I did something very similar to BrowserUK's suggestion and found we could search the archive drive in 2 to 3 hours. The beauty is that I can run about 20 searches concurrently before we see serious degradation (it is now a quad-Xeon box running Linux, vastly faster than the W2K server it was on until about 12 months ago). The operators have a little Tk window which sends a request to the server; the server runs the search and emails them when it is done. They love it! In total, about a day's work.

IMHO, this is a wonderful task for Perl.
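The core of such a search can be sketched in a few lines of Perl. This is a minimal illustration of scanning pipe-delimited archive files for a term, not the poster's actual code; the function name and the cheap-substring-first strategy are my assumptions.

```perl
use strict;
use warnings;

# Hypothetical sketch: scan one pipe-delimited archive file line by
# line and collect every record containing $term. A plain index()
# substring test is used because it is cheaper than a regex match
# when you only need "does this record mention the term at all".
sub search_archive {
    my ($path, $term) = @_;
    my @hits;
    open my $fh, '<', $path or die "cannot open $path: $!";
    while ( my $line = <$fh> ) {
        push @hits, $line if index( $line, $term ) >= 0;
    }
    close $fh;
    return @hits;
}
```

Running one such scan per file (or one per worker process, as in the concurrent setup described above) keeps memory flat even on multi-hundred-megabyte archives, since only one line is held at a time.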
In Section: Seekers of Perl Wisdom