Currently I buffer about 5,000 lines of output in a hash, then write them out in append mode to however many files I've accumulated. I doubt this is the best way to do it. Originally I had thought to keep all the filehandles open in an array, but I believe I ran into the OS limit on open file descriptors that way.
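For reference, a minimal sketch of that buffering approach, assuming the input is keyed by target filename; the names `%buffer`, `buffer_line()`, and `flush_all()` are illustrative, not from my actual code:

```perl
use strict;
use warnings;

# Hypothetical sketch of the approach described above: accumulate lines
# per filename in a hash, then flush in append mode.
my %buffer;          # filename => array of pending output lines
my $pending  = 0;
my $flush_at = 5000; # flush once ~5000 lines have accumulated

sub buffer_line {
    my ($file, $line) = @_;
    push @{ $buffer{$file} }, $line;
    flush_all() if ++$pending >= $flush_at;
}

sub flush_all {
    # Open each file briefly in append mode, so only one handle is held
    # at a time and we stay well under the OS descriptor limit.
    for my $file ( keys %buffer ) {
        open my $fh, '>>', $file or die "can't append to $file: $!";
        print {$fh} @{ $buffer{$file} };
        close $fh or die "can't close $file: $!";
    }
    %buffer  = ();
    $pending = 0;
}
```

The obvious cost is one open/close per file per flush, which is why I'm wondering if there's a smarter way.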
Is there a better way to do something like this quickly?
We've recently switched from Solaris to quad dual-core Xeons running Linux with 16 GB of RAM... it's OK if I use up to about 75% of the system (maybe even a bit higher) for the biggest jobs.
In reply to Writing to many (>1000) files at once by suaveant