in reply to Slow script: DBI, MySQL, or just too much data?

AFAIK (which isn't a vast amount...), the SELECT you're doing already *is* doing a join, but on the SQL parser side, internal to MySQL.

Anytime you find yourself selecting something just so you can feed it back to the DB as parameters, it's a good bet you're making Perl do work that should be done by the DB. So, yes, you'd probably be much better off if you used a more complex DELETE statement.
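
For illustration, here's a minimal sketch of the two patterns, using hypothetical parent/child tables (made-up names and schema, not the OP's):

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:testdb', 'user', 'pass',
                           { RaiseError => 1 });

    # Round-trip version: Perl fetches IDs only to feed them straight back.
    my $ids = $dbh->selectcol_arrayref(
        'SELECT id FROM parent WHERE stale = 1');
    my $del = $dbh->prepare('DELETE FROM child WHERE parent_id = ?');
    $del->execute($_) for @$ids;

    # Single-statement version: the join and the delete both happen
    # inside MySQL, and no row data crosses the wire at all.
    $dbh->do(q{
        DELETE child FROM child
          JOIN parent ON parent.id = child.parent_id
         WHERE parent.stale = 1
    });

The second form also runs as one statement instead of one per ID, which usually matters as much as avoiding the round trip.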

--
"This alcoholism thing, I think it's just clever propaganda produced by people who want you to buy more bottled water." -- pedestrianwolf


Re^2: Slow script: DBI, MySQL, or just too much data?
by menolly (Hermit) on Apr 15, 2005 at 18:20 UTC
    Yes, there are some simple joins already, but I'd have to replace, for example:
    delete from testDBRepCats where repCatID = ?
    with something like:
    delete r.* from testDBRepCats r, testDBToMCQDB m, testDB t, clients c where r.repCatID = m.repCatID and m.testDigest = t.testDigest and t.clientID = c.clientID and c.drone != '$drone'
    (not checked for validity)
    except that's much more complex to chunk (for time-out purposes), because just adding LIMIT 1000 won't maintain data integrity the way the current approach does. I'm processing 1000 tests at a time, and all the data for those tests is removed before I check the time and move on to the next chunk. So I'm left with something like:
    delete r.* from testDBRepCats r, testDBToMCQDB m, testDB t where r.repCatID = m.repCatID and m.testDigest = t.testDigest and t.testDigest = ?
    passed to execute_array(), which may or may not be a gain over the current approach, since it's a more complex statement involving a lot more tables.
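    For reference, a minimal sketch of that execute_array() call, assuming $dbh is the connected handle and @digests holds the testDigest values for the current 1000-test chunk:

    my $sth = $dbh->prepare(q{
        delete r.* from testDBRepCats r, testDBToMCQDB m, testDB t
        where r.repCatID  = m.repCatID
          and m.testDigest = t.testDigest
          and t.testDigest = ?
    });
    # Column-wise binding: one tuple per digest; per-tuple results
    # land in @status.
    my @status;
    $sth->execute_array({ ArrayTupleStatus => \@status }, \@digests)
        or die "execute_array failed: " . $sth->errstr;

    Note that execute_array() still issues one DELETE per digest under the hood with most drivers, so the win, if any, comes from the server doing the join rather than from fewer statements.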