in reply to Querying databases in a loop

Others have pointed out the problem with your DBI code. I just wanted to point out that you might be able to improve things by putting all of these dictionaries in one table, with a DICTIONARY_ID column as part of the primary key. That would allow you to use a single prepared SQL statement with placeholders.
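
For instance, a minimal sketch of that layout with DBI might look like the following. The DSN, table name, and word column are invented for illustration; only the DICTIONARY_ID idea is from above.

    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details; substitute your own.
    my $dbh = DBI->connect('DBI:mysql:database=words', 'user', 'password',
                           { RaiseError => 1 });

    # All dictionaries live in one table, keyed by (dictionary_id, word).
    $dbh->do(q{
        CREATE TABLE IF NOT EXISTS dictionary (
            dictionary_id INT NOT NULL,
            word          VARCHAR(64) NOT NULL,
            PRIMARY KEY (dictionary_id, word)
        )
    });

    # One prepared statement with placeholders serves every dictionary.
    my $sth = $dbh->prepare(
        'SELECT word FROM dictionary WHERE dictionary_id = ? AND word = ?'
    );

    for my $dict_id (1 .. 4) {
        $sth->execute($dict_id, 'example');
        print "found it in dictionary $dict_id\n" if $sth->fetchrow_array;
        $sth->finish;
    }

    $dbh->disconnect;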

Re: Re: Querying databases in a loop
by schnarff (Acolyte) on May 14, 2002 at 00:33 UTC
    At first, putting all the words into one giant table with an identifier for which subset they came from seemed like a good idea. Then I tried it.

    Basically, MySQL suffers a *huge* performance hit with the large table (600K+ words) vs. the smaller table (350K+ words); I think I ran up against some sort of limit in the software. For comparison's sake, identical queries took 3.31 seconds on the large table and 0.76 seconds on the smaller one -- over 4 times as long.

    Since it's not possible -- or, as was well pointed out, practical -- to use a placeholder for the table name, and since upon further review I only have 4 tables to deal with (I thought it was going to be more like 12, but I ruled a bunch of word subsets out), I'm just going to keep 4 separate statements prepared (and yes, one database handle *can* support multiple prepared statements -- I've done it before).
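
    Roughly, what I have in mind looks like this; the table names here are invented for the example, not my real schema:

        use strict;
        use warnings;
        use DBI;

        my $dbh = DBI->connect('DBI:mysql:database=words', 'user', 'password',
                               { RaiseError => 1 });

        # One handle, four prepared statements -- one per word table.
        # The table names are placeholders for the real ones.
        my %sth;
        for my $table (qw(dict_one dict_two dict_three dict_four)) {
            $sth{$table} = $dbh->prepare("SELECT word FROM $table WHERE word = ?");
        }

        # Check a candidate word against each dictionary in turn.
        my $candidate = 'example';
        for my $table (sort keys %sth) {
            $sth{$table}->execute($candidate);
            print "'$candidate' is in $table\n" if $sth{$table}->fetchrow_array;
            $sth{$table}->finish;
        }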

    Thanks for your help anyway.

    Alex Kirk
      What kind of indexing do you have on the table? I think the one-table paradigm is the way to go. You'll have to throw an index or two on it, but no worries there.
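      For example, something along these lines might do it; this is only a sketch, and the table and column names are assumptions rather than your actual schema:

          # Assuming the combined table is keyed by (dictionary_id, word),
          # an extra index on word alone speeds up lookups across dictionaries.
          $dbh->do('CREATE INDEX idx_word ON dictionary (word)');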
      As Thor said, you probably just have an indexing problem on your table. I know that the more recent MySQL table types handle very large data sets well, so this should not be a problem.