Muy Large File
by BuddhaLovesPerl (Sexton)
on Mar 14, 2005 at 07:24 UTC
BuddhaLovesPerl has asked for the wisdom of the Perl Monks concerning the following question:
Venerable greetings to the monks,
I have a single file that ranges daily from 45-50 GB on a Solaris 8 server with 16 GB of RAM and eight 900 MHz CPUs. The task is simply to replace specific control characters with a space. Due to space limitations, the transform must be done in place on the original file. The records are fixed-length ASCII.
Using Perl with a single loop, unpack, and s///, this was taking over 4 hours; I'm not sure how much longer it would have taken to complete, as I killed it. The transform rate at kill time was about 400 MB per hour.
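To make the task concrete, below is a minimal chunked in-place sketch. The filename, buffer size, and \x00-\x07 range are placeholders, not my real values; tr/// should be considerably cheaper than s///g for fixed one-for-one character swaps:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(O_RDWR);

    my $file    = 'bigfile.dat';       # placeholder name
    my $bufsize = 8 * 1024 * 1024;     # work in 8 MB chunks

    sysopen(my $fh, $file, O_RDWR) or die "open $file: $!";
    binmode $fh;

    my $pos = 0;
    while (1) {
        my $n = sysread($fh, my $buf, $bufsize);
        die "read: $!" unless defined $n;
        last unless $n;                # EOF
        # tr/// returns the number of characters changed, so we can
        # skip the write-back for chunks that were already clean
        if ($buf =~ tr/\x00-\x07/ /) { # placeholder character range
            sysseek($fh, $pos, 0)         or die "seek: $!";
            syswrite($fh, $buf, $n) == $n or die "write: $!";
            # writing $n bytes puts us back at $pos + $n, ready to read on
        }
        $pos += $n;
    }
    close $fh or die "close: $!";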
The sysadmin wraith types don't want dev orc types running jobs for 24 hours (or more) per file. Ergo, I need to find the fastest method possible to accomplish this.
Two questions, if you please:
1) Without sounding like a cliché, what is the 'fastest' way to get this done? Is a combination of Sys::Mmap and forks the fastest ticket out of this hell? (A rough sketch of what I mean follows the questions.)
2) With all the icing, could this ever be done in less than 12 hours?
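For question 1, this is the untested sort of thing I have in mind: fork one worker per CPU and let each mmap a page-aligned slice of the file. The filename and character range are again placeholders, and I'm assuming a 64-bit, large-file-capable perl, plus hoping that a length-preserving tr/// leaves the mapping intact (the Sys::Mmap docs only show in-place writes via substr):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Sys::Mmap;

    my $file  = 'bigfile.dat';         # placeholder name
    my $ncpus = 8;
    my $size  = -s $file or die "stat $file: $!";
    my $page  = 8192;                  # Solaris SPARC page size
    # round slices up to a page boundary, since mmap offsets must be
    # page-aligned
    my $chunk = (int($size / $ncpus / $page) + 1) * $page;

    for my $i (0 .. $ncpus - 1) {
        my $off = $i * $chunk;
        last if $off >= $size;
        my $len = $off + $chunk > $size ? $size - $off : $chunk;

        defined(my $pid = fork) or die "fork: $!";
        next if $pid;                  # parent: spawn the next worker

        open(my $mfh, '+<', $file) or die "open: $!";
        mmap(my $map, $len, PROT_READ|PROT_WRITE, MAP_SHARED, $mfh, $off)
            or die "mmap: $!";
        $map =~ tr/\x00-\x07/ /;       # placeholder range; same length,
                                       # so the mapped buffer never grows
        munmap($map) or die "munmap: $!";
        exit 0;
    }
    1 while wait != -1;                # reap all workers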
A lagging japh,