What kind of data is stored in the files? If it is structured data, it sounds like this information would be better stored in a database, rather than in all these files. That's what databases were created for: storing hundreds of thousands of records in an organized, optimized way for easy retrieval later.
Otherwise, you can easily have 1000 files in a directory without hurting performance. Grab the first 4 or 5 digits of the filename for your directory naming convention. That will make it easy to find a file when you need it. Don't make yourself traverse more than one level of directories if it's really not needed.
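For what it's worth, here's a minimal sketch of that prefix idea in Python. The function names, the `.jpg` extension, and the 4-digit prefix are just assumptions for illustration, not anything from the thread:

    import os
    import shutil

    def prefixed_path(base_dir, file_id, prefix_len=4):
        """Build a path like base_dir/1234/12345678.jpg from a numeric ID.

        Files whose IDs share the same leading digits land in the same
        subdirectory, so no single directory grows without bound.
        (Hypothetical layout -- adjust extension/prefix to taste.)
        """
        name = str(file_id)
        prefix = name[:prefix_len].zfill(prefix_len)  # pad short IDs so every prefix is the same width
        return os.path.join(base_dir, prefix, name + ".jpg")

    def store(src, base_dir, file_id):
        """Copy src into its prefix subdirectory, creating it if needed."""
        dest = prefixed_path(base_dir, file_id)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        shutil.copy2(src, dest)
        return dest

Looking a file up later is just the same computation in reverse: recompute `prefixed_path` from the ID and you know exactly which directory to open, with only one level of traversal.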
Re: brainteaser: splitting up a namespace evenly
They're image files, and my experiences with stuffing images into databases have all been negative. Thanks for the suggestion though.
It may be that more than 1000 files per directory wouldn't hurt performance much (this is Solaris), but it's just much easier to work with directories that hold a reasonable number of files, i.e. ones that don't choke tools like ls.
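As a rough back-of-the-envelope for the "evenly" part, a sketch like the one below picks a prefix length from the total file count; the name `prefix_length` and the 1000-files-per-directory cap are illustrative assumptions, not something established in the thread:

    import math

    def prefix_length(total_files, max_per_dir=1000):
        """How many leading digits to bucket on so that, assuming IDs are
        spread evenly across the namespace, each subdirectory ends up with
        at most roughly max_per_dir files. (Assumed helper, not from the thread.)
        """
        buckets_needed = max(1, math.ceil(total_files / max_per_dir))
        return max(1, math.ceil(math.log10(buckets_needed)))

    # Example: 500,000 files at ~1000 per directory needs ~500 buckets,
    # which a 3-digit prefix (000-999) covers comfortably.
    print(prefix_length(500_000))  # -> 3

The even spread only holds if the IDs themselves are roughly uniform; if they cluster (e.g. sequential IDs that haven't reached the higher prefixes yet), some directories will fill up faster than others.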