in reply to Getting/handling big files w/ perl
First -- Welcome!
Now, this is a guess, but it sounds like (from your description) the compressed files you need are static and the only dynamic file is the 'wget ... 0.5 GB file'. In my experience, the overhead of shelling out to external gzip/gunzip via syscalls becomes negligible once a file is larger than about 2 MB, so keep using the syscalls for that one. But if the 49 files are static, then forget about the gzip/gunzip steps entirely and leave them on disk as raw data files. With your current equipment the extra disk space should be easy to spare!
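For instance, here is a minimal sketch of that split (the filenames and the line-oriented processing are my assumptions, not from your post): the static files are read directly as raw data, while the one dynamic download is streamed through gunzip with a pipe open, so the 0.5 GB never has to be decompressed to disk first.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Static file: no gzip/gunzip step at all -- just read the raw data.
    open my $raw, '<', 'static_01.dat'          # placeholder filename
        or die "open static_01.dat: $!";
    while (my $line = <$raw>) {
        # ... process $line ...
    }
    close $raw;

    # Dynamic file: stream it through gunzip via a list-form pipe open,
    # decompressing the download on the fly instead of on disk.
    open my $gz, '-|', 'gunzip', '-c', 'download.gz'   # placeholder filename
        or die "gunzip download.gz: $!";
    while (my $line = <$gz>) {
        # ... process $line ...
    }
    close $gz;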
If all the files are dynamic, then I would spend the time upgrading the network instead (if possible); GigE hardware is inexpensive these days.
And I agree with the earlier suggestion to use the 'Devel::NYTProf' profiler to find any *real* bottlenecks.
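In case it helps, the usual invocation looks like this (the script name is just a placeholder):

    perl -d:NYTProf bigfiles.pl    # profiles the run, writes nytprof.out
    nytprofhtml                    # renders nytprof.out as an HTML report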
Good Luck...Ed
"Well done is better than well said." - Benjamin Franklin