rhumbliner has asked for the wisdom of the Perl Monks concerning the following question:

i'm using mysqldump to dump the contents of a database into multiple text files:

mysqldump --tab=./ psites

one of my tables contains jpeg images in a mediumblob field. the problem i'm having is that when i use perl to read the resulting files, i find that the newlines are not escaped, so a single record is viewed by my perl script as multiple records.
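
for reference, a naive line-per-record count along these lines shows the symptom (the file name here is just an example, not the real table):

# count "records" by counting physical lines -- this over-counts whenever a
# blob value contains a newline, because that newline still starts a new
# physical line in the dump file
# (file name is made up for illustration)
open my $in, '<', 'psites_images.txt' or die "can't open dump file: $!";
my $records = 0;
$records++ while <$in>;
print "$records records?\n";
close $in;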

this whole thing confuses me 'cuz mysqlimport loads the correct number of records. how mysqlimport can read the proper number of records but perl cannot is a mystery to me.

can someone enlighten me?

Replies are listed 'Best First'.
Re: mysqldump files -- SOLVED
by rhumbliner (Sexton) on Aug 24, 2010 at 23:26 UTC

    sigh, i misunderstood how mysql was escaping newlines and tabs: each one is written out as a backslash followed by the literal character, so a record can span several physical lines. the following code properly reads the file output by mysqldump (although it may not be the most efficient):

    my $line = '';
    while (<$in>) {
        $line .= $_;
        next if m/\\$/;        # trailing backslash: the newline was escaped, record continues
        chomp $line;           # drop the real record terminator
        my @cols = ('');
        my @tabs = split /\t/, $line;
        foreach (@tabs) {
            # a piece ending in a backslash means the tab was escaped data,
            # not a field separator, so glue it back together
            $cols[$#cols] =~ m/\\$/ and do { $cols[$#cols] .= "\t$_"; next; };
            push @cols, $_;
        }
        shift @cols;           # discard the empty seed element
        # <processing done here>
        $line = '';
    }
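
    for comparison, here is a sketch of a stricter version of the same idea. it also handles a field that really ends in a backslash (dumped as \\ before a real tab or newline, which the single-trailing-backslash test above would mis-read as a continuation) and strips the escape characters back out of the field values. it assumes the default mysqldump --tab escaping (a backslash in front of an embedded tab, newline, backslash or NUL, and \N for SQL NULL); $in is the same filehandle as above.

    my $record = '';
    while (my $chunk = <$in>) {
        $record .= $chunk;

        # the physical newline is part of the data only if it is escaped,
        # i.e. the line ends in an odd number of backslashes
        my ($trail) = $chunk =~ /(\\*)\n?\z/;
        next if length($trail) % 2;
        chomp $record;                     # drop the real record terminator

        # walk the record treating backslash + any character as a literal
        # pair, so only unescaped tabs split fields
        my @fields;
        while ($record =~ /\G((?:\\[\s\S]|[^\t\\])*)(\t?)/g) {
            push @fields, $1;
            last if $2 eq '';              # no tab matched: end of record
        }

        # undo the escaping: \N is SQL NULL, \0 is a NUL byte, and any other
        # backslashed character stands for itself
        for my $f (@fields) {
            if ($f eq '\\N') { $f = undef; next; }
            $f =~ s/\\(.)/$1 eq '0' ? "\0" : $1/seg;
        }

        # <processing done here>
        $record = '';
    }

    the parity check matters because a run of backslashes at the end of a line is ambiguous to a single-character test: \\ followed by the line terminator is a literal backslash at the end of the record, not an escaped newline. whether the extra unescaping pass is wanted depends on what the processing does with the blob columns; LOAD DATA (and therefore mysqlimport) performs the same decoding on its side, which is why it always sees the right number of rows.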
Re: mysqldump files
by Anonymous Monk on Aug 24, 2010 at 21:31 UTC
    how mysqlimport can read the proper number of records but perl cannot is a mystery to me.

    Because you wrote the code, and you don't know what you're doing?