in reply to Muy Large File

I have a single file that ranges daily from 45-50 GB on a Solaris 8 server with 16 GB of RAM and eight 900 MHz CPUs.

How can you not be taking advantage of that horsepower?

I would seriously look into ROMIO (http://www-unix.mcs.anl.gov/romio/), MPICH's implementation of the parallel I/O interface from the MPI 2.0 standard (http://www.mpi-forum.org/docs/mpi-20-html/node171.htm#Node171).

If it has to be Perl, then I would certainly look into parallelizing this application. As a brute-force approach: split the file 8 ways, run a process to take care of each piece, then join the darn things back together, as sketched below. Even with the splitting and rejoining, I am sure it would be faster than what is happening right now.
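
A minimal sketch of that brute-force approach, assuming the job is line-oriented so chunks can be cut on line boundaries; the file name and the process_line() stub are placeholders for whatever the real work is:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(SEEK_SET);

    my $file   = 'muy_large_file.dat';   # placeholder name
    my $nprocs = 8;

    my $size  = -s $file or die "stat $file: $!";
    my $chunk = int($size / $nprocs);

    for my $i (0 .. $nprocs - 1) {
        my $start = $i * $chunk;
        my $end   = ($i == $nprocs - 1) ? $size : ($i + 1) * $chunk;

        defined(my $pid = fork) or die "fork: $!";
        next if $pid;                           # parent keeps forking

        # child: handle every line that *starts* in [$start, $end)
        open my $in,  '<', $file          or die "open $file: $!";
        open my $out, '>', "$file.part$i" or die "open part: $!";
        if ($start > 0) {                       # the line straddling
            seek $in, $start - 1, SEEK_SET;     # $start belongs to the
            <$in>;                              # previous worker
        }
        while (tell($in) < $end and defined(my $line = <$in>)) {
            print $out process_line($line);
        }
        exit 0;
    }
    1 until wait == -1;                         # reap all the workers

    # stitch the processed pieces back together in order
    system('cat ' . join(' ', map { "$file.part$_" } 0 .. $nprocs - 1)
           . " > $file.out") == 0 or die "join failed: $?";

    sub process_line { my ($line) = @_; $line }  # stand-in transform

Each child only touches lines that start inside its own byte range, so no line is processed twice and none is skipped at the chunk boundaries.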

Re^2: Muy Large File
by crenz (Priest) on Mar 15, 2005 at 11:41 UTC
    Since he needs it to be in-place, I suggest 8 processes that work on the same file at the same time... if that's possible on Solaris. You can use seek for that.
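
    A rough sketch of that, with the caveat that rewriting in place at fixed offsets is only safe when the edit preserves length; the file name and the tr/a-z/A-Z/ transform here are just stand-ins:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Fcntl qw(SEEK_SET);

        my $file    = 'muy_large_file.dat';        # placeholder name
        my $nprocs  = 8;
        my $bufsize = 1024 * 1024;                 # 1 MB blocks

        my $size  = -s $file or die "stat $file: $!";
        my $chunk = int($size / $nprocs) + 1;

        for my $i (0 .. $nprocs - 1) {
            defined(my $pid = fork) or die "fork: $!";
            next if $pid;                          # parent keeps forking

            # child: rewrite bytes [$start, $end) of the shared file
            my $start = $i * $chunk;
            my $end   = $start + $chunk;
            $end = $size if $end > $size;

            open my $fh, '+<', $file or die "open $file: $!";
            for (my $pos = $start; $pos < $end; ) {
                my $want = $end - $pos;
                $want = $bufsize if $want > $bufsize;
                seek $fh, $pos, SEEK_SET  or die "seek: $!";
                read $fh, my $buf, $want  or die "read: $!";
                $buf =~ tr/a-z/A-Z/;               # length-preserving edit
                seek $fh, $pos, SEEK_SET  or die "seek: $!";
                print {$fh} $buf          or die "write: $!";
                $pos += length $buf;
            }
            exit 0;
        }
        1 until wait == -1;                        # reap the workers

    Since each worker writes back exactly as many bytes as it read, at the same offsets, the ranges never collide and no temp files or rejoining are needed.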