in reply to Need directory scheme to store 400,000 images

A similar approach worked fine for me back in the old dark ages of DOS and 8.3 file names. FAT-16 dealt pretty badly with huge subdirectories (meaning over 2,000 files; I never had to deal with 400k files).

There probably IS a better way. But is it worth it?

I'd suggest you set up a simple isolation layer, so you can do

$fh=image_open("123456.jpg");
Then run some tests with it; it should be easy to create a sample 400k-file subtree with something like the right distribution of file sizes, if you have the disk space.
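A minimal sketch of such an isolation layer, in Python for concreteness (your one-liner looks Perl-ish, and the names image_open/image_create and the four-level, first-four-characters layout are my assumptions, not your actual scheme):

```python
import os

ROOT = "images"  # hypothetical root of the image store

def image_path(name):
    """Map a flat name to a nested path: "123456.jpg" ->
    images/1/2/3/4/123456.jpg (first four characters as directory levels).
    Swap this one function out if the layout turns out to perform badly."""
    levels = list(name[:4])
    return os.path.join(ROOT, *levels, name)

def image_open(name, mode="rb"):
    """Isolation layer: callers say image_open("123456.jpg") and never
    learn how files are actually laid out on disk."""
    return open(image_path(name), mode)

def image_create(name, data=b""):
    """Write an image, creating intermediate directories as needed.
    Handy for generating a sample 400k-file subtree to benchmark against."""
    path = image_path(name)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as fh:
        fh.write(data)
```

Generating the test subtree is then just a loop calling image_create with 400k names and realistic file sizes; the rest of the code only ever sees image_open.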

If it's fast enough, then don't worry about it. If you have a performance issue, then worry about making it better.

Offhand, I suspect that if you do have a performance issue, it will come from the horrible impact this kind of structure has on your disk cache: accessing any given file means reading four directories and then the file, and your images may not be accessed in any cache-friendly order.

If it turns out the performance bites, you can change your open routine to use some other underlying structure without affecting the rest of the code...


Mike