in reply to Re: A story of a Perl Monk and Problem
in thread A story of a Perl Monk and Problem

But wouldn't it still need to go through the directory on every request from the web? I still think an RDBMS would be better. You'd only loop through the files once, when you create the table, and the memory requirements would be minimal beyond the MySQL daemon itself.

Once you had your table of filenames, you would select only the few filenames needed to build each index page. The list of files is already prepared and stored in the table, so there's very little extra work involved in serving a user a page.
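A minimal sketch of that approach, using DBI. The table and column names here are made up for illustration, and an in-memory SQLite database stands in for MySQL so the example is self-contained; with MySQL only the connect string would change, the SQL is the same:

```perl
use strict;
use warnings;
use DBI;

# In-memory SQLite for a self-contained demo;
# for MySQL the DSN would be e.g. 'dbi:mysql:database=mydb'.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1 });

# One-time setup: loop over the directory once, storing each filename.
$dbh->do('CREATE TABLE files (filename TEXT PRIMARY KEY)');
my $ins = $dbh->prepare('INSERT INTO files (filename) VALUES (?)');
$ins->execute(sprintf 'file%03d.txt', $_) for 1 .. 25;  # stand-in for readdir()

# Per request: fetch only the filenames needed for one index page.
my $per_page = 10;
my $page     = 2;    # 1-based page number from the query string
my $rows = $dbh->selectcol_arrayref(
    'SELECT filename FROM files ORDER BY filename LIMIT ? OFFSET ?',
    undef, $per_page, ($page - 1) * $per_page,
);
print "$_\n" for @$rows;
```

The LIMIT/OFFSET pair is what keeps the per-request cost small: the web process never touches the directory, and only one page's worth of rows crosses the wire.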


Replies are listed 'Best First'.
Re: Re: Re: A story of a Perl Monk and Problem
by Brovnik (Hermit) on May 19, 2001 at 22:09 UTC
    Yes, it would. This falls into the "If I were trying to get there, I wouldn't start from here" category, but I was answering the specific question of "how do I use seek(POS)" rather than the broader "how do I present 90,000 files to the user".

    I agree with thpfft that trying to present them all to the user isn't the way; a search would be much better.

    Unless the filenames are descriptive (which is difficult if they are in 8.3 notation), the search also needs to cover some content or keywords related to each file, so you really should have some sort of persistent database interface to the directory.
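    One way to sketch that keyword search (a hypothetical schema with a keywords column kept alongside each filename; again an in-memory SQLite database via DBI keeps the example self-contained, but the query is plain SQL):

```perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1 });

# Hypothetical schema: keywords stored next to each opaque 8.3 filename.
$dbh->do('CREATE TABLE files (filename TEXT PRIMARY KEY, keywords TEXT)');
my $ins = $dbh->prepare('INSERT INTO files (filename, keywords) VALUES (?, ?)');
$ins->execute('RPT0001.TXT', 'sales report 2001 april');
$ins->execute('RPT0002.TXT', 'inventory summary march');
$ins->execute('IMG0001.GIF', 'logo artwork');

# Search on the keywords rather than on the unhelpful 8.3 names.
my $term  = 'report';
my $found = $dbh->selectcol_arrayref(
    'SELECT filename FROM files WHERE keywords LIKE ? ORDER BY filename',
    undef, "%$term%",
);
print "$_\n" for @$found;
```

    A LIKE scan is fine at this scale; for 90,000 files you'd likely want a separate keyword table or the database's full-text facilities instead.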
    --
    Brovnik.

    Edit: chipmunk 2001-05-19