And those vendor-supplied tools are orders of magnitude faster (and usually hard to beat in terms of data validation) when compared to executing a series of "insert" statements from a Perl/DBI script. Also, they are pretty flexible: you can choose what suits you best in terms of record and field delimiters, and how the db server should behave while loading the data.
If the data files you are importing contain only hundreds (or maybe a few thousand) of records, the speed difference might not count for much. But the fact that the tool already exists (and does excellent error trapping/handling) still makes it worthwhile.
For data sets of many thousands of records, you could be looking at a run-time difference of 10 to 1 or worse for Perl/DBI inserts vs. the server's native import tool (i.e. if mysqlimport takes 6 minutes, Perl/DBI inserts will take at least an hour).
It looks like you are opening a file for writing CSV-style records, but you don't seem to ever write to that file handle. Let me suggest that you drop the idea of connecting to the database in this script, and just focus on writing a proper comma- (or better yet, tab-)delimited output file that puts the intended insertion data into a consistent set of plain-text rows.
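The file-writing part can be as simple as this (a minimal sketch — the record data, field layout and output filename here are made up; in your real script the rows would come from whatever parsing you are already doing):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical parsed records -- substitute whatever fields
# your parsing code actually produces.
my @records = (
    [ 1, 'alpha', '2003-01-15' ],
    [ 2, 'beta',  '2003-02-20' ],
);

# mysqlimport derives the table name from the file's basename,
# so it helps to name the output file after the target table.
open( my $out, '>', 'mytable.txt' )
    or die "can't write mytable.txt: $!";

for my $rec (@records) {
    print $out join( "\t", @$rec ), "\n";   # one tab-delimited row per line
}
close $out;
```

Tabs make a better delimiter than commas here because you don't have to worry about quoting fields that themselves contain commas.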
Then read up on your db server's import tool and work out how to feed it the data file that was written by your Perl script. You'll finish your task a lot quicker that way: in addition to the actual db load being a lot faster, the Perl script will be a lot simpler to code, and will run quickly on its own.
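In the MySQL case, for example, loading that tab-delimited file could look something like this (a sketch — the database name, user and filename are placeholders; tab and newline happen to be mysqlimport's defaults, but spelling them out documents what the Perl script wrote):

```shell
# mysqlimport maps mytable.txt onto the table "mytable" in mydb
mysqlimport --local \
  --fields-terminated-by='\t' \
  --lines-terminated-by='\n' \
  -u myuser -p mydb mytable.txt
```

The equivalent from inside the mysql client would be a `LOAD DATA LOCAL INFILE 'mytable.txt' INTO TABLE mytable;` statement. Other servers have their own tools (e.g. PostgreSQL's `COPY`), with similar options for delimiters.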
In reply to Re: Parsing Script
by graff
in thread Parsing Script
by phimtau123