in reply to DBI speed up needed on MySQL

In case the problem is memory consumption plus thrashing, it may be worthwhile to batch-process your rows with a LIMIT clause in your SQL statement (and repeated execution, of course). Just get n rows at a time and process those before going on with the next batch. I'd guess n should be 1000 or 2500 or so (try and err ;)
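A minimal sketch of such a batching loop, assuming a hypothetical table my_table with columns id and payload (the connection parameters are made up too). The integers are interpolated via sprintf %d rather than bound as placeholders, to sidestep any quoting issues with placeholders inside LIMIT:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection parameters -- substitute your own.
    my $dbh = DBI->connect('dbi:mysql:database=mydb', 'user', 'pass',
                           { RaiseError => 1 });

    my $batch  = 1000;   # try 1000 or 2500 and measure
    my $offset = 0;

    while (1) {
        # ORDER BY keeps the paging stable between executions; %d is
        # safe here because $offset and $batch are plain integers.
        my $sql = sprintf(
            'SELECT id, payload FROM my_table ORDER BY id LIMIT %d, %d',
            $offset, $batch);
        my $rows = $dbh->selectall_arrayref($sql);
        last unless @$rows;              # no rows left -- done
        for my $row (@$rows) {
            my ($id, $payload) = @$row;
            # ... process one row here ...
        }
        $offset += $batch;
    }

    $dbh->disconnect;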

Or use fetchrow_arrayref() and get your row processing out of the way immediately.
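For instance (reusing $dbh and the hypothetical table from the sketch above), process each row the moment it is fetched instead of accumulating the whole result in a Perl array first:

    my $sth = $dbh->prepare('SELECT id, payload FROM my_table');
    $sth->execute;

    # fetchrow_arrayref reuses a single array reference per row, so
    # Perl-side memory stays flat no matter how many rows come back,
    # as long as you don't push copies onto a growing array.
    while (my $row = $sth->fetchrow_arrayref) {
        my ($id, $payload) = @$row;
        # ... process immediately ...
    }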

(Btw, have you found out anything yet about where the bottleneck is?)

Re^2: DBI speed up needed on MySQL
by jacques (Priest) on May 09, 2005 at 20:35 UTC
    That's what I ended up doing: sending the data over a little at a time with the LIMIT clause, but this is not ideal. Hopefully, under normal circumstances, my script would not handle tens of thousands of rows at once. For an important test, though, we decided to send over all the rows (and that's when we saw the huge slowdown). So far, the bottleneck appears to be what thor suggested: MySQL fetches all of the data before returning the first row. The more I increase the limit, the longer it takes to process the first row. ...
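    (If the bottleneck really is the client library buffering the entire result set before the first fetch, which is DBD::mysql's default mysql_store_result behaviour, then its documented mysql_use_result attribute streams rows from the server as they are fetched instead. A sketch, reusing $dbh and the hypothetical table from above:)

        # mysql_use_result streams rows instead of buffering the full
        # result set client-side, so the first row arrives quickly even
        # for huge result sets. Trade-off: the connection stays busy
        # (and tables may stay locked) until every row is fetched, so
        # fetch promptly and don't pause mid-loop.
        my $sth = $dbh->prepare('SELECT id, payload FROM my_table',
                                { mysql_use_result => 1 });
        $sth->execute;

        while (my $row = $sth->fetchrow_arrayref) {
            # first row is available almost immediately
        }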