In case the problem is memory consumption plus thrashing, it may be worthwhile to batch-process your rows by adding a LIMIT clause to your SQL statement (and re-executing it for each batch, of course). Just fetch n rows at a time and process those before moving on to the next batch. I'd guess n should be around 1000 or 2500 (trial and error ;) — see the sketch below.
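Something like this, as an untested sketch — the connection parameters, big_table, and the id/payload columns are placeholders for your own schema:

    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection; substitute your own DSN and credentials.
    my $dbh = DBI->connect('dbi:mysql:database=mydb;host=localhost',
                           'user', 'password', { RaiseError => 1 });

    my $batch_size = 1000;
    my $offset     = 0;

    while (1) {
        # Interpolate the (integer) LIMIT values directly with sprintf;
        # some older DBD::mysql versions quote placeholders in a LIMIT
        # clause, which MySQL rejects. ORDER BY keeps the batches stable.
        my $sth = $dbh->prepare(sprintf
            'SELECT id, payload FROM big_table ORDER BY id LIMIT %d OFFSET %d',
            $batch_size, $offset);
        $sth->execute;

        my $rows = $sth->fetchall_arrayref;   # just this batch, not everything
        last unless @$rows;

        for my $row (@$rows) {
            my ($id, $payload) = @$row;
            # ... process one row here ...
        }
        $offset += $batch_size;
    }

    $dbh->disconnect;

One caveat: OFFSET gets slower as it grows, since MySQL still has to scan past the skipped rows. If the table has a sequential id, remembering the last id you saw and selecting WHERE id > $last_id ... LIMIT $batch_size is the faster variant.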
Or use fetchrow_arrayref() and do your row processing immediately, one row at a time, instead of pulling the whole result set into memory first — roughly as sketched below.
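A minimal sketch, reusing $dbh from above (big_table and its columns are again placeholders):

    my $sth = $dbh->prepare('SELECT id, payload FROM big_table');
    $sth->execute;

    # fetchrow_arrayref returns the same arrayref on each call, so copy
    # the values out if you need to keep them past this iteration.
    while (my $row = $sth->fetchrow_arrayref) {
        my ($id, $payload) = @$row;
        # ... process the row right here; nothing accumulates in memory ...
    }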
(Btw, have you found out anything yet about where the bottleneck is?)