in reply to processing huge files

This looks like it may be a database question you just haven't asked yet, and the answer is likely to be database-dependent.

For example, if you were inserting into Sybase, then all other things being horribly equal, you would be

1) building a huge uncheckpointed transaction log,

2) locking database resources on a grand scale, and

3) creating a single transaction of gigantic proportions.

Any one of those, even taken alone, could be causing such symptoms.
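If the loop really is issuing one giant transaction, committing in batches is the usual first mitigation. A minimal sketch of the idea, using SQLite from the standard library as a stand-in for the real RDBMS (the table, batch size, and row data are all made up for illustration):

```python
import sqlite3

BATCH_SIZE = 1000  # hypothetical; tune to the real RDBMS and workload

conn = sqlite3.connect(":memory:")  # stand-in for the real database handle
conn.execute("CREATE TABLE rows (id INTEGER, payload TEXT)")

pending = 0
for i in range(5000):  # stands in for the loop over the huge file
    conn.execute("INSERT INTO rows VALUES (?, ?)", (i, "line %d" % i))
    pending += 1
    if pending >= BATCH_SIZE:
        conn.commit()  # checkpoint: keeps each transaction, and the log, small
        pending = 0

conn.commit()  # flush the final partial batch
count = conn.execute("SELECT COUNT(*) FROM rows").fetchone()[0]
print(count)  # 5000
```

The same pattern applies with Perl's DBI (`AutoCommit => 0` plus a periodic `$dbh->commit`), though as noted below, a bulk loader beats any flavour of row-by-row insert.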

If the database isn't Sybase, the same underlying cause may still apply conceptually, but it will show up as different, database-dependent symptoms.

Please may we have the rest of the loop plus the identity and version of the RDBMS.

One world, one people

Re^2: processing huge files
by geektron (Curate) on Aug 02, 2005 at 14:04 UTC
    the rest of the loop is there, as well as the "secret identity" of the RDBMS. ( MySQL 4.1.10 )
      In this case, the MySQL LOAD DATA command appears to be the way to go, as another responder has already suggested. The replies that suggest using Perl classes are missing the point: the database just can't take all those row-by-row inserts, which is exactly why databases come with bulk-loading facilities.

      One world, one people

        well, that's the direction i'm going, it seems. i tried a few other things that *might* have freed up DB resources, but after about 20 minutes of running the job, the memory usage spiralled out of control.
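For anyone landing here later, the bulk-load route settled on above would look something like this in MySQL 4.1; the table name, column list, and file path are hypothetical, and the file is assumed to be tab-delimited with one row per line:

```sql
-- MyISAM only: optionally defer index maintenance until after the load
-- ALTER TABLE big_table DISABLE KEYS;

LOAD DATA LOCAL INFILE '/tmp/huge_file.txt'
INTO TABLE big_table
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(col_a, col_b, col_c);

-- ALTER TABLE big_table ENABLE KEYS;
```

`LOCAL` reads the file from the client machine; drop it if the file already sits on the server and the account has the FILE privilege.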