Storing 100K files in a single directory is sure to tax any filesystem. You would have to subdivide the directory into multiple levels, à la CPAN, to keep it manageable. Putting the images into a database would free you of that burden, at the price of serving the files somewhat slower. After all, serving files is what a filesystem does best, but sometimes you have to take it by the hand and guide it.
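A minimal sketch of what that subdivision might look like: derive a two-level prefix from a digest of the filename, much as CPAN splits authors/id/G/GE/GEOFF. The root path, filename, and helper name here are just illustrative.

use strict;
use warnings;
use Digest::MD5 qw(md5_hex);
use File::Path  qw(make_path);
use File::Spec;

# Two hex characters per level gives 256 * 256 = 65536 leaf
# directories, so 100K files average out to a couple per directory.
sub hashed_path {
    my ($root, $name) = @_;
    my $digest = md5_hex($name);
    my ($l1, $l2) = (substr($digest, 0, 2), substr($digest, 2, 2));
    return File::Spec->catfile($root, $l1, $l2, $name);
}

my $path = hashed_path('/var/images', 'photo_0042.jpg');
make_path((File::Spec->splitpath($path))[1]);  # create the subdirectories
print "$path\n";  # e.g. /var/images/3f/a1/photo_0042.jpg

Hashing the name rather than taking its leading characters keeps the spread even when filenames share a common prefix.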
CountZero
"A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James
I don't know how much usage you're going to be seeing (how many requests per sec, etc.), but based on the general information you've mentioned, I'd probably do the following:
- Keep the images in the filesystem
- Store the images in a series of hashed directories
- Maintain a table of file locations and other metadata (see the sketch after this list).
- Turn off .htaccess
- Turn off directory listings for the whole server.
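Here is one way the metadata table could look, assuming SQLite via DBI; the table name, columns, and sample values are all illustrative, not a prescribed schema:

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=images.db', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

# Map an image ID to its hashed location plus whatever metadata
# the application needs.
$dbh->do(<<'SQL');
CREATE TABLE IF NOT EXISTS images (
    id        INTEGER PRIMARY KEY,
    path      TEXT NOT NULL,   -- hashed filesystem location
    mime_type TEXT NOT NULL,
    bytes     INTEGER NOT NULL
)
SQL

# Record a stored file...
$dbh->do('INSERT INTO images (path, mime_type, bytes) VALUES (?, ?, ?)',
         undef, '/var/images/3f/a1/photo_0042.jpg', 'image/jpeg', 48213);

# ...and look it up later without walking the directory tree.
my ($path) = $dbh->selectrow_array(
    'SELECT path FROM images WHERE id = ?', undef, 1);

This way the application never has to list or search directories; the filesystem only ever sees direct opens of known paths, which is the cheap case.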
.htaccess files are useful when control of the system is distributed (different people control different directories), but on a dedicated server they're unnecessary overhead: Apache has to check every directory along a requested path for one. That's especially costly for deep directory structures, which hashing would create.
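The directives for the last two points, assuming Apache and an illustrative /var/images tree, would go in the server config (httpd.conf) rather than in per-directory files:

<Directory "/var/images">
    AllowOverride None    # skip .htaccess lookups entirely
    Options -Indexes      # no directory listings
</Directory>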