in reply to fetchall_arrayref DBI question?

Depending on the database software you're using, you could try using cursors, which are guaranteed to keep the data on the DB server side. For example, in PostgreSQL (DBD::Pg) you'd do something like this:

$dbh->do("DECLARE csr CURSOR WITH HOLD FOR SELECT * FROM large_table o +rder by id"); while (1) { my $sth = $dbh->prepare("fetch 100000 from csr"); $sth->execute; my $res = $sth->fetchall_arrayref({}); if ($res && @{$res}) { # batch process the 100k rows } else { # error or no more rows exit; } }

The code above retrieves your table in 100k-row chunks. If you're using Postgres and need to process tables that don't fit in memory, but still want reasonable speed, this is a good way to go.
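One caveat with WITH HOLD: when the declaring transaction commits, Postgres copies the held cursor's rows into a temporary area on the server, which you pay for once up front. If you can run the whole loop inside a single transaction, you can drop WITH HOLD and skip that. A self-contained sketch, with placeholder DSN, credentials, and table name:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # placeholder connection details -- adjust for your setup
    my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "user", "pass",
                           { RaiseError => 1, AutoCommit => 0 });

    # plain cursor: lives only inside this transaction
    $dbh->do("DECLARE csr CURSOR FOR SELECT * FROM large_table ORDER BY id");

    my $sth = $dbh->prepare("FETCH 100000 FROM csr");
    while (1) {
        $sth->execute;
        my $rows = $sth->fetchall_arrayref({});
        last unless $rows && @$rows;    # no more rows
        for my $row (@$rows) {
            # process one row, e.g. $row->{id}
        }
    }

    $dbh->do("CLOSE csr");
    $dbh->commit;
    $dbh->disconnect;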