I'm not aware of any UNIX that will let a single process have 19200 files open simultaneously; it's possible the situation is different under Windows, but I doubt it. (On most UNIXes, `ulimit -n` shows the per-process limit.) In a quick test, I can open 1020 files before my script dies with a "Too many open files" error.
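For the curious, a sketch of that quick test (the /tmp/fhtest scratch names are my own; the count you get depends on your `ulimit -n` and on the handles the interpreter already holds):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Open numbered scratch files until the OS refuses.
my @handles;
while (1) {
    open my $fh, '>', "/tmp/fhtest.$$." . scalar @handles
        or last;    # fails with EMFILE: "Too many open files"
    push @handles, $fh;
}
print "Opened ", scalar @handles, " files before failing: $!\n";

# Tidy up.
close $_ for @handles;
unlink glob "/tmp/fhtest.$$.*";
```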
Workarounds include:

- Store only the filename in the hash, then open the file just before writing to it and close it again afterwards (see the first sketch below).
- The same, but cache open filehandles: upon receiving a "Too many open files" error, close the least-recently-used filehandle and jot down somewhere that it needs to be re-opened in append mode (see the FileCache sketch at the end).
- Append the data to the hash entry itself, then, once all the input has been read, walk the hash and print each entry's contents to the appropriate file. (This only works if the whole data set fits in memory.)
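Here is a sketch of the first approach; the tab-delimited input, with the first field selecting the output file, is my assumption, not the OP's actual format:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %file_for;    # key => output filename; only names live in the hash

while (my $line = <STDIN>) {
    my ($key) = split /\t/, $line;                 # assumed record layout
    my $path  = $file_for{$key} ||= "out_$key.txt";

    # Open in append mode, write one record, close again: at most one
    # extra descriptor is ever in use, however many output files exist.
    open my $fh, '>>', $path or die "Cannot append to $path: $!";
    print $fh $line;
    close $fh or die "Cannot close $path: $!";
}
```

The price is an open/close pair per record; buffering a few records per key in the hash and flushing them in batches would amortize that.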
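For the filehandle-caching variant you needn't roll your own LRU bookkeeping: the FileCache module that ships with Perl closes and quietly re-opens files (in append mode) when the process nears its descriptor limit. A sketch, under the same assumed input format, if I recall the interface correctly:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use FileCache;    # core module; also accepts 'use FileCache maxopen => N'

while (my $line = <STDIN>) {
    my ($key) = split /\t/, $line;      # assumed record layout
    my $fh = cacheout "out_$key.txt";   # opens, or re-opens for append,
                                        # closing cached handles as needed
    print $fh $line;
}
```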