in reply to DBI: How to update 150K records efficiently
You might want to look into the documentation for your database, as there's likely a bulk-loading facility to handle this sort of replacement.
In Oracle, the command is 'sqlldr'. For MySQL, use the 'LOAD DATA LOCAL INFILE' command with the 'REPLACE' keyword. (Note -- omitting the LOCAL keyword will likely result in a permissions failure.)
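A bare-bones MySQL version might look like this (the file path and table name here are just placeholders):

    LOAD DATA LOCAL INFILE '/tmp/updates.txt'
        REPLACE INTO TABLE mytable;
    -- LOCAL makes the client read the file and send it to the server,
    -- so you don't need the FILE privilege or server-side filesystem access.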
There are also tricks you can use with a table of this size -- I copy the table, drop all the indexes, load the data, then recreate the indexes and swap the new table in for the original (see the sketch below). This saves updating the indexes on every insertion. With Oracle, you can defer constraints so they're checked en masse (but if you screw up, you have to roll back the entire transaction).
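A rough MySQL sketch of that copy/reload/swap sequence -- the index name, columns, and file path are hypothetical, and your table will differ:

    -- build an empty copy with the same schema (LIKE copies indexes too)
    CREATE TABLE files_new LIKE files;
    -- drop the secondary indexes so the load doesn't maintain them row by row
    ALTER TABLE files_new DROP INDEX idx_mod_time;
    -- bulk-load into the unindexed copy
    LOAD DATA LOCAL INFILE '/tmp/new_data.txt' REPLACE INTO TABLE files_new;
    -- rebuild the indexes once, in a single pass
    ALTER TABLE files_new ADD INDEX idx_mod_time (mod_time);
    -- atomically swap the new table in for the original
    RENAME TABLE files TO files_old, files_new TO files;
    DROP TABLE files_old;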
For comparison ... I updated a MySQL table this morning (6.2 million records) using 'LOAD DATA' in under 7.5 minutes:
    mysql> LOAD DATA LOCAL INFILE '/tmp/HINODE_sizes.txt' REPLACE
        -> INTO TABLE hinode.files
        -> ( filepath, instrument, quicklook, mod_time, filesize );
    Query OK, 11884251 rows affected, 1 warning (7 min 26.99 sec)
    Records: 6264061  Deleted: 5620190  Skipped: 0  Warnings: 1
In your example, if your primary key is 'vc', you should be able to load it as a pipe-delimited file.
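Something along these lines, assuming a hypothetical table and columns (swap in your real names):

    -- /tmp/updates.txt holds one pipe-delimited record per line, e.g.:
    --   abc123|42|2008-04-01
    LOAD DATA LOCAL INFILE '/tmp/updates.txt'
        REPLACE INTO TABLE mytable
        FIELDS TERMINATED BY '|'
        ( vc, hits, mod_date );
    -- REPLACE deletes any existing row with the same primary key ('vc')
    -- before inserting, which gives you the bulk-update behavior.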