in reply to processing slows way down

Whenever you have a really large amount of data to insert into a table, it is much better and faster to use a loading utility that is native to the database server. (In the case of Oracle, that is SQL*Loader -- the command-line tool is called sqlldr in 9i.)
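
As a rough sketch (the table, columns, and account here are just placeholders), a SQL*Loader run is driven by a small control file that describes the flat file and the target table:

    LOAD DATA
    INFILE 'customers.dat'
    APPEND
    INTO TABLE customers
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (cust_id, cust_name, signup_date DATE "YYYY-MM-DD")

and is then invoked with something along the lines of:

    sqlldr userid=scott/tiger control=customers.ctl log=customers.log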

The most effective approach is to use a simple Perl script to create the text stream that serves as input to the native dbms import tool. That gives you a chance to catch hazardous mistakes before loading -- i.e. errors in the data that the dbms tool would accept without complaint even though they are wrong.
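
A bare-bones sketch of that idea (the file names, field layout, and checks are invented for illustration):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Read raw user data (tab-separated here), sanity-check each record,
    # and write a comma-delimited file for SQL*Loader to consume.
    open my $in,  '<', 'raw_input.txt' or die "raw_input.txt: $!";
    open my $out, '>', 'customers.dat' or die "customers.dat: $!";

    while (my $line = <$in>) {
        chomp $line;
        my ($id, $name, $date) = split /\t/, $line;

        # Reject records the database might accept even though they are
        # wrong (missing fields, non-numeric ids, malformed dates, ...).
        unless (defined $id && defined $date
                && $id =~ /^\d+$/ && $date =~ /^\d{4}-\d{2}-\d{2}$/) {
            warn "skipping bad record at line $.: $line\n";
            next;
        }
        $name =~ s/"/""/g;                      # escape embedded quotes
        print {$out} qq{$id,"$name",$date\n};
    }
    close $out or die "customers.dat: $!";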

I haven't done a recent systematic benchmark of Perl/DBI vs. SQL*Loader, but the last time I had occasion to compare the two, doing row-by-row inserts via Perl/DBI was orders of magnitude slower than using the native loader.

Re^2: processing slows way down
by Anonymous Monk on Jul 10, 2004 at 17:04 UTC
    Thank you for your suggestion.

    Yes, we've used SQL*Loader and it has worked fine for us for straight loads.

    We thought, however, that with the complexity of the user data -- having to translate it into valid database data, combine existing input and database data to form new database data, and produce descriptive edit reports of the load -- Perl had a little more flexibility than SQL*Loader.

      You might be best off using Perl to process your user input, but replacing your insert statements with prints to an output file that can subsequently be processed by SQL*Loader.
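
      Concretely (the filehandle and field names below are just placeholders), the change inside your existing row loop is roughly:

          # before: one DBI round trip per row
          # $sth->execute(@fields);

          # after: print the same fields as a delimited line for SQL*Loader
          print $load_fh join(',', @fields), "\n";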

        That's an interesting idea; I'll look into it.

        Thank you.