http://qs1969.pair.com?node_id=346122


in reply to performance problem with oracle dbd

I've done a fair amount of work with multi-thousand-row updates/inserts into Oracle, and have found this process:
  1. spool out the data
  2. parse it with Perl
  3. load the data with SQL*Loader
to be by far the best way to do it.

If you're moving from MySQL to Oracle, I reckon this approach could be the best way.

One tip (which I learned the hard way) is not to wrap everything up into a single one-click operation. Separate all the tasks into discrete scripts and number them, e.g. 1.export_mysql.sh, 2.parse_output.pl, 3.add_to_oracle.sh.
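As a sketch of what a script like 3.add_to_oracle.sh would drive, SQL*Loader reads a small control file describing the incoming data; the table and column names here are made up for illustration:

```sql
-- load.ctl: minimal SQL*Loader control file (hypothetical table/columns)
LOAD DATA
INFILE 'load.dat'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id, name, created_at DATE "YYYY-MM-DD")
```

The script itself then just runs something like `sqlldr userid=... control=load.ctl log=load.log` and checks the exit status and log before you move on to the next numbered step.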

The advantage of keeping everything separate is that if one part fails, it doesn't automatically roll on to the next one. No matter how good your error checking is, there will always be one thing you haven't taken into account.

When I use the above method, I find my outages are very well organised, and if I'm sick, someone else can easily perform the same outage with very little ramp-up time.

Of course, most of this is off-topic, but it may help nonetheless.