perrin has asked for the wisdom of the Perl Monks concerning the following question:
After fiddling with the options for both, I can't find a way to make them agree. It looks like my only option might be to s{\\n}{\n}g on the way out of MySQL and then reverse that on the way back in after processing with Text::CSV_XS. That is likely to hurt performance, since I am using the getline() and print() methods in Text::CSV_XS and I would have to switch to reading and printing the lines myself. (Reading the lines and dealing with embedded newlines myself? Ick!)
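Something along these lines is what I have in mind, as a rough sketch only (the file names are made up, and it assumes each physical line of the dump is one complete record because the embedded newlines come out as the two-character sequence backslash-n):

use Text::CSV_XS;
use Carp qw(croak);

# same options as the parser in my real code
my $csv = Text::CSV_XS->new({
    binary       => 1,
    always_quote => 1,
    escape_char  => '\\',
});

open my $in_fh,  '<', 'dump.csv'       or croak("open: $!");
open my $out_fh, '>', 'dump_fixed.csv' or croak("open: $!");

while (my $line = <$in_fh>) {
    chomp $line;

    # turn the escaped newlines back into real ones before parsing
    $line =~ s{\\n}{\n}g;
    $csv->parse($line) or croak("parse error: " . $csv->error_input());
    my @fields = $csv->fields();

    # ... process @fields here ...

    # re-escape the newlines before writing the row back out
    $csv->combine(@fields) or croak("combine error");
    (my $out = $csv->string()) =~ s{\n}{\\n}g;
    print {$out_fh} $out, "\n";
}

close $in_fh  or croak("close: $!");
close $out_fh or croak("close: $!");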
If anyone has any suggestions, I would appreciate them. Here's a (reduced) sample of the code I'm using:
use Text::CSV_XS;
use IO::Handle;
use Carp qw(croak);

my $file = $ARGV[0];

my $parser = Text::CSV_XS->new({
    binary       => 1,
    always_quote => 1,
    escape_char  => '\\',    #' help emacs
});

# open input and output temp files
open(my $in_fh, '<', $file) or croak("Unable to open input file $!");

my $line_num = 0;

# Test eof() in the loop condition so we only check whether this is
# the last line once per iteration.
while (not eof($in_fh)) {
    $line_num++;

    # parse the row and handle any errors
    my $in_values = $parser->getline($in_fh);
    croak("Unable to parse line $line_num of file $file: "
              . $parser->error_input())
        if !$in_values;
}

close $in_fh or croak("Error closing file $file: $!");
And the SQL looks like this:
SELECT blah blah blah
INTO OUTFILE ?
FIELDS
    TERMINATED BY ','
    ESCAPED BY '\\'
    ENCLOSED BY '"'
LINES TERMINATED BY '\n'
Replies are listed 'Best First'.
Re: Text::CSV_XS and MySQL newline handling
by radiantmatrix (Parson) on Nov 17, 2005 at 19:03 UTC
by perrin (Chancellor) on Nov 17, 2005 at 20:52 UTC
Re: Text::CSV_XS and MySQL newline handling
by jZed (Prior) on Nov 17, 2005 at 21:12 UTC
by perrin (Chancellor) on Nov 17, 2005 at 21:42 UTC