in reply to Re: loading csv files into MySQL databse
in thread loading csv files into MySQL databse

LOAD DATA will do mass imports. My issue with it is that (I believe that) single errors will cause an entire import to fail.
Sometimes... If the error is the data itself being malformed or inconsistent, then maybe. If the error is a row violating a primary key, though, check out the REPLACE / IGNORE clauses of LOAD DATA. Postgres does not yet have these options, and they save a tremendous amount of time.
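For what it's worth, here's a minimal sketch of how REPLACE / IGNORE slot into a LOAD DATA statement fired through DBI. The connection details, file path, table name, and delimiters are placeholders, not anything from your code:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# mysql_local_infile=1 lets the client send the file from its side;
# dsn/user/password here are made up -- use your own.
my $dbh = DBI->connect(
    'dbi:mysql:database=test;host=localhost;mysql_local_infile=1',
    'user', 'password',
    { RaiseError => 1 },
);

# IGNORE silently skips rows that would violate a unique or primary key;
# swap it for REPLACE to overwrite the existing row instead of skipping it.
$dbh->do(q{
    LOAD DATA LOCAL INFILE '/tmp/import.csv'
    IGNORE
    INTO TABLE my_table
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
});

$dbh->disconnect;

Either way the load keeps going instead of dying on the first duplicate key.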

Do not ever split on \t for a tab-separated file. I've done this many times; that solution is stupid and crufty. Set sep_char in Text::CSV instead. Everything you're doing now, minus the chopping of whitespace, can be done by Text::CSV. It will run faster (if it can make use of Text::CSV_XS) AND it will handle the edge cases you aren't handling. There is even DBD::CSV (which uses csv_sep_char, with Text::CSV_XS under the hood), so you can simply SELECT from one table and INSERT it into SQL!! Zomfg, split is suicide here.
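Roughly, a sketch of the Text::CSV route. The column names, file name, and connection details are invented for the example; the point is sep_char plus getline:

#!/usr/bin/perl
use strict;
use warnings;
use Text::CSV;   # transparently uses Text::CSV_XS if it's installed
use DBI;

my $csv = Text::CSV->new({
    sep_char => "\t",   # tab-separated input instead of commas
    binary   => 1,      # cope with embedded quotes, newlines, etc.
}) or die Text::CSV->error_diag;

my $dbh = DBI->connect('dbi:mysql:database=test', 'user', 'password',
                       { RaiseError => 1 });
my $sth = $dbh->prepare(
    'INSERT INTO my_table (col1, col2, col3) VALUES (?, ?, ?)'
);

open my $fh, '<', 'data.tsv' or die "Can't open data.tsv: $!";
while (my $row = $csv->getline($fh)) {
    $sth->execute(@$row);       # one arrayref per parsed record
}
$csv->eof or $csv->error_diag;  # report a parse error if we bailed early
close $fh;
$dbh->disconnect;

DBD::CSV works along the same lines: connect with something like dbi:CSV:f_dir=/path/to/dir;csv_sep_char=\t and you can SELECT straight out of the file as if it were a table, then INSERT the rows into MySQL.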


Evan Carroll
I hack for the ladies.
www.EvanCarroll.com