in reply to Re: Basic Perl trumps DBI? Or my poor DB design?
in thread Basic Perl trumps DBI? Or my poor DB design?

If you have enough memory, you could write a daemon process that would hold the entire dataset in memory.

Ok, I'll bite. How can you do this? Is there a basic template/framework that just opens a socket and talks back and forth with whoever's on the other side? I used to do this in C decades ago (sheesh), but it's alien to me in Perl. (Well, I'd rather use a package that facilitates this, if one exists, than write it from scratch with raw code.)

FWIW, my use of this would be extremely similar to the example posed by the originator of this thread. In my case, I have keywords associated with all the photos on my site, and I just want to optimize the way users search for images based on those keywords. I have the code all written and it works fine, technically, but I hate the fact that the entire textfile-DB has to be loaded and parsed every time, just to spew out answers and then close up shop... and this happens for EVERY search. It seems that the best thing to do is have a daemon do the open once, then perform the searches (based on Perl hashes) for each query. Any potential drawbacks with this?
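
A minimal sketch of that shape using core IO::Socket::INET, for what it's worth. The keywords.txt file name, the tab-separated "image<TAB>keyword" line format, and port 9000 are placeholders for illustration, not anything from the thread:

    #!/usr/bin/perl
    # Sketch of a keyword-lookup daemon: load the textfile-DB once, then
    # answer one-keyword-per-line queries over a socket.
    use strict;
    use warnings;
    use IO::Socket::INET;

    # Load the keyword index once, at daemon startup.
    my %images_for;    # keyword => array ref of image names
    open my $fh, '<', 'keywords.txt' or die "Cannot open keywords.txt: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($image, $keyword) = split /\t/, $line;
        push @{ $images_for{ lc $keyword } }, $image;
    }
    close $fh;

    # Listen for search queries.
    my $server = IO::Socket::INET->new(
        LocalPort => 9000,
        Listen    => 5,
        Reuse     => 1,
    ) or die "Cannot listen on port 9000: $!";

    while (my $client = $server->accept) {
        while (defined(my $query = <$client>)) {
            chomp $query;
            my $hits = $images_for{ lc $query } || [];
            print {$client} join("\t", @$hits), "\n";
        }
        close $client;
    }

This toy version handles one client at a time; a real daemon would fork per connection or lean on a framework such as Net::Daemon (mentioned below) to deal with concurrency, logging, and pid files.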


Re: Writing a Perl Daemon (was: Basic Perl trumps DBI? Or my poor DB design?)
by lhoward (Vicar) on Oct 26, 2004 at 13:05 UTC
    In your case, I'd use something like a SQL database with DBIx::TextIndex, or Plucene. The main reason I recommended a daemon to him is that it could be highly optimized for the types of relationships he's modeling.

    But if you're going down the persistent-network-daemon route, I'd start with something like Net::Daemon. It's a very nice framework to start building on.
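
    A rough sketch of that approach, loosely following the subclass-and-Run() pattern from the Net::Daemon documentation. The keywords.txt file, its tab-separated format, the port, and the one-keyword-per-line protocol are assumptions for illustration:

        package KeywordServer;
        use strict;
        use warnings;
        use base qw(Net::Daemon);

        my %images_for;    # keyword => array ref of image names, loaded once

        sub load_index {
            open my $fh, '<', 'keywords.txt' or die "Cannot open keywords.txt: $!";
            while (my $line = <$fh>) {
                chomp $line;
                my ($image, $keyword) = split /\t/, $line;
                push @{ $images_for{ lc $keyword } }, $image;
            }
            close $fh;
        }

        # Net::Daemon calls Run() once per client connection.
        sub Run {
            my ($self) = @_;
            my $socket = $self->{socket};
            while (defined(my $query = $socket->getline)) {
                chomp $query;
                my $hits = $images_for{ lc $query } || [];
                $socket->print(join("\t", @$hits), "\n");
                $socket->flush;
            }
        }

        package main;
        KeywordServer::load_index();
        my $server = KeywordServer->new({ localport => 9000, pidfile => 'none' }, \@ARGV);
        $server->Bind();    # enters the accept loop; Net::Daemon handles forking/threading

    The nice part is that the framework takes care of the accept loop and per-connection dispatch, so your code is just "load the hashes once, answer lookups in Run()".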

    L