Well, what was happening was a combination of bad designs, of which slurping in all the data was just one. Other factors played an important role: the use of a poor database, one that didn't escalate 3.7 million row locks into a single table lock, and the fact that application code ran on the database server. But what I fail to understand is why there is an application involved at all. From the pseudo-code I get the impression that all that's done is fetching all the data from a table, writing it to a file, then loading it back into a (different?) database. Why not have the one database dump its table using its native tools?
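To illustrate the "use the native tools" point with something self-contained, here's a minimal sketch using SQLite as a stand-in (the thread doesn't name the actual DBMS, and the `events` table is hypothetical). The database's own dump facility moves the table as a stream of SQL, so no per-row application code ever touches the data:

```python
import sqlite3

# Set up a hypothetical "source" database with some rows in it.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
src.executemany("INSERT INTO events (payload) VALUES (?)",
                [("a",), ("b",), ("c",)])
src.commit()

# Native dump: iterdump() emits the SQL statements that recreate the
# database; feed them straight into the destination. No application
# logic inspects or transforms individual rows.
dst = sqlite3.connect(":memory:")
dst.executescript("\n".join(src.iterdump()))

rows = dst.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(rows)  # all 3 rows arrived via the dump
```

With a server-based DBMS the same idea is a one-liner piping the vendor's dump tool into the vendor's load tool (e.g. `pg_dump ... | psql ...` for PostgreSQL), which also sidesteps the lock-escalation problem entirely.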