The problem is that there can be hundreds of original senders. Having that many filehandles open is certain to be problematic. ... Should I just risk opening a zillion filehandles?
On what do you base your supposition that having lots of file descriptors open is a problem? What risks do you perceive? You assert, but did you test? By default you can have 509 on Win2K and 1021 on Linux. Three handles are used for STDIN, STDOUT, and STDERR, so the totals are 512 and 1024 handles respectively.
C:\tmp>perl -e "open ++$fh, '>', $fh or die qq'$fh $!\n' for 1..$ARGV[0]" 512
510

[root@devel3 tmp]# perl -e 'open ++$fh, ">", $fh or die "$fh $!\n" for 1..$ARGV[0]' 1024
1022 Too many open files
But so what? Just increase the number if you need to. On Linux:
[root@devel3 tmp]# ulimit -n 65535
[root@devel3 tmp]# perl -e 'open ++$fh, ">", $fh or die "$fh $!\n" for 1..$ARGV[0]' 2048
[root@devel3 tmp]# ls 204?
2040  2041  2042  2043  2044  2045  2046  2047  2048
[root@devel3 tmp]#
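For the original problem, here is a minimal sketch of the open-everything approach: one filehandle per sender, opened lazily and cached in a hash. The input format (comma-separated, sender name first) is just an assumption for illustration; adjust the split to your real data.

use strict;
use warnings;

my %fh;    # sender name => open filehandle
while ( my $line = <STDIN> ) {
    my ($sender) = split /,/, $line, 2;
    unless ( exists $fh{$sender} ) {
        # open() autovivifies a filehandle in the hash slot (Perl 5.6+)
        open $fh{$sender}, '>>', "$sender.out"
            or die "Can't open $sender.out: $!\n";
    }
    print { $fh{$sender} } $line;
}
close $_ for values %fh;

With the ulimit raised as above, this happily keeps hundreds of handles open at once.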
It is not actually the number of open file handles that will cause an issue. Depending on the underlying file system, you will start to see problems if you go over 10-20,000 files in a single directory with ext2/3. ReiserFS does not care.
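If you do hit that per-directory ceiling, a common workaround (sketched here with a hypothetical two-character bucketing scheme) is to spread the output files across subdirectories so no single directory holds more than a fraction of them:

use strict;
use warnings;
use File::Path qw(mkpath);

# Map a sender name to a path like "out/ab/abcorp.out" so the files
# are spread over many small buckets instead of one huge directory.
sub bucketed_path {
    my ($sender) = @_;
    my $bucket = lc( substr( $sender, 0, 2 ) );
    mkpath "out/$bucket" unless -d "out/$bucket";
    return "out/$bucket/$sender.out";
}

With hundreds of senders split across buckets, ext2/3 directory lookups stay cheap without any filesystem tuning.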
cheers
tachyon