PerlMonks
Re: Muy Large File

by perlfan (Vicar)
on Mar 14, 2005 at 19:24 UTC ( [id://439406] )


in reply to Muy Large File

I have a single file that ranges daily from 45-50Gig on a Solaris 8 server with 16G ram and 8 900Mhz cpu's.

How can you not be taking advantage of that horsepower?

I would seriously look into ROMIO, MPICH's implementation of MPI-IO (part of the MPI 2.0 standard): http://www-unix.mcs.anl.gov/romio/ (see also http://www.mpi-forum.org/docs/mpi-20-html/node171.htm#Node171).

If it has to be Perl, then I would certainly look into parallelizing this application. As a brute-force approach: split the file 8 ways, run a process to take care of each piece, then join the darn things back together. Even with the splitting and rejoining, I am sure it would be faster than what is happening right now.
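A minimal sketch of that split/process/join idea, using fork. Everything here is hypothetical for illustration: the file name, the demo input, and the `process_line()` transform (a simple uppercase stand-in for whatever per-record work is actually needed). Boundaries are snapped to newlines so no record is split across workers.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $nworkers = 8;
my $infile   = 'big.dat';    # hypothetical file name

sub process_line { my ($line) = @_; return uc $line }   # placeholder transform

# demo input so the sketch is runnable
open my $demo, '>', $infile or die $!;
print $demo "record $_\n" for 1 .. 1000;
close $demo;

# Chunk boundaries, snapped forward to the next newline so no record
# is split across workers.
my $size = -s $infile;
open my $in, '<', $infile or die "open $infile: $!";
my @bounds = (0);
for my $i (1 .. $nworkers - 1) {
    seek $in, int($size * $i / $nworkers), 0;
    <$in>;                              # discard the partial line
    push @bounds, tell $in;
}
push @bounds, $size;
close $in;

# One forked worker per chunk, each writing its own part file.
for my $i (0 .. $nworkers - 1) {
    my $pid = fork;
    die "fork: $!" unless defined $pid;
    next if $pid;                       # parent keeps looping
    open my $fh,  '<', $infile   or die $!;
    open my $out, '>', "part.$i" or die $!;
    seek $fh, $bounds[$i], 0;
    while (tell($fh) < $bounds[$i + 1]) {
        defined(my $line = <$fh>) or last;
        print $out process_line($line);
    }
    close $out;
    exit 0;
}
1 while wait != -1;                     # reap all workers

# Rejoin the pieces in order.
open my $joined, '>', "$infile.new" or die $!;
for my $i (0 .. $nworkers - 1) {
    open my $part, '<', "part.$i" or die $!;
    print $joined $_ while <$part>;
    unlink "part.$i";
}
close $joined;
```

On a real 50GB file you would of course skip the demo-input step, and the per-chunk temporary files mean you briefly need roughly double the disk space, which is the cost of avoiding in-place edits.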

Replies are listed 'Best First'.
Re^2: Muy Large File
by crenz (Priest) on Mar 15, 2005 at 11:41 UTC
    Since he needs it to be in-place, I suggest 8 processes that work on the same file at the same time... if that's possible on Solaris. You can use seek for that.
