in reply to Code Efficiency

If you have 10000 files in a directory, you may also be running into limits on how well the OS handles directories that large. See my writeup on this in a different thread.

Depending on the OS, you are better off (from a memory / OS standpoint) making your directory structure deeper, so that when the OS opens a file it does not have to read more directory entries than necessary. The wider / flatter the directory structure, the more resources it may take to reach an individual file (on open(), for example).
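For example (just a sketch, not from the original thread: the two-level layout and the bucketed_path helper are only one possible scheme), you could spread the files across subdirectories keyed on the leading characters of the filename:

    use strict;
    use warnings;
    use File::Path qw(make_path);
    use File::Copy qw(move);

    # Hypothetical example: spread files across two levels of
    # subdirectories keyed on the first two characters of the name,
    # e.g. "report42.txt" -> archive/r/e/report42.txt
    sub bucketed_path {
        my ($base, $name) = @_;
        my $a = lc substr($name, 0, 1);
        my $b = lc substr($name, 1, 1);
        $b = '_' if $b eq '';    # very short names fall back to '_'
        return ("$base/$a/$b", "$base/$a/$b/$name");
    }

    my ($dir, $file) = bucketed_path('archive', 'report42.txt');
    make_path($dir);             # create archive/r/e if needed
    move("flatdir/report42.txt", $file) or die "move failed: $!";

With roughly even name distribution, each leaf directory ends up holding a small fraction of the files, so no single directory grows large enough to slow down lookups.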

--MidLifeXis

Replies are listed 'Best First'.
Re: Re: Code Efficiency
by fourmi (Scribe) on Mar 26, 2004 at 11:15 UTC
    Hi,
    yeah, i definitely found that Windows tends to get screwy once a dir has more than 30,000 files in it, hence splitting into smaller chunks, though even then it's still fairly screwy. That is a VERY interesting thread you link to, thanks very much, i didn't catch it initially.
    cheers!