Following up: every time your 'select *' gets compiled, the DB has to query its metadata to expand the '*'. A smart database will cache that information, but the work still has to be done. With a lot of data over a slow link, the overhead adds up: a colleague and I found that switching to prepared statements (prepare_cached(), actually) cut a transfer of 24 hours' worth of data from more than a day down to about an hour. A rough sketch of the approach is below.
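Here is a minimal DBI sketch of what I mean. The DSNs, credentials, the table name (log_data) and its columns are all made up for illustration; the point is that the SELECT names its columns explicitly and the INSERT is prepared once via prepare_cached(), so only the bound values cross the link on each execute.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Placeholder DSNs/credentials -- adjust to your own databases.
    my $src = DBI->connect('dbi:mysql:database=src_db;host=srchost', 'user', 'pass',
                           { RaiseError => 1 });
    my $dst = DBI->connect('dbi:Pg:dbname=dst_db;host=dsthost',      'user', 'pass',
                           { RaiseError => 1, AutoCommit => 0 });

    # Name the columns instead of 'SELECT *': the statement is compiled
    # once and the server never has to expand '*' per call.
    my $sel = $src->prepare(q{
        SELECT id, ts, payload FROM log_data WHERE ts >= ? AND ts < ?
    });

    # prepare_cached() compiles the INSERT once and reuses the handle,
    # so each execute() sends only the bound values.
    my $ins = $dst->prepare_cached(q{
        INSERT INTO log_data (id, ts, payload) VALUES (?, ?, ?)
    });

    $sel->execute('2024-01-01 00:00:00', '2024-01-02 00:00:00');
    while ( my $row = $sel->fetchrow_arrayref ) {
        $ins->execute(@$row);
    }
    $dst->commit;

In practice you would also want batching and error handling, but even this shape avoids recompiling the statement for every row.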
There is another reason to use prepared statements: with placeholders, the bound values are never interpolated into the SQL text, which makes them resistant to SQL injection attacks. I would post a link, but Perl Monks censored it; Google "Bobby Tables" for what I mean.
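A small sketch of the difference, with a made-up SQLite database and a made-up 'students' table (the classic Bobby Tables example):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    my $dbh  = DBI->connect('dbi:SQLite:dbname=school.db', '', '',
                            { RaiseError => 1 });
    my $name = q{Robert'); DROP TABLE students;--};   # hostile input

    # Unsafe: interpolating the value makes it part of the SQL text,
    # so the attacker's quote and semicolon are parsed as SQL.
    # my $sth = $dbh->prepare("SELECT id FROM students WHERE name = '$name'");

    # Safe: with a placeholder the value travels separately from the
    # statement and can never be parsed as SQL.
    my $sth = $dbh->prepare('SELECT id, name FROM students WHERE name = ?');
    $sth->execute($name);
    while ( my ($id, $n) = $sth->fetchrow_array ) {
        print "$id\t$n\n";
    }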
In reply to Re^2: dbi: moving big data among databases (prepared statements)
by Anonymous Monk
in thread dbi: moving big data among databases (prepared statements)
by leostereo