in reply to Skip problematic lines while populating database

The ideal first step would be to use DBI placeholders instead of building the SQL statement as a string. The second step would be to use Text::CSV_XS for parsing your delimited input file.

my $sth_insert = $dbi->prepare(<<'SQL');
INSERT INTO biochemical VALUES (?,?,?,?,?,?,?,?,?,?,...,?)
SQL

for my $row (@rows) {
    $sth_insert->execute( @$row );
};

With DBI placeholders, you don't need to worry about special characters like double quotes in your strings.
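
For the second step, a minimal sketch of how @rows could be filled with Text::CSV_XS before the insert loop (the file name and the tab separator are assumptions about your input):

use Text::CSV_XS;

my $csv = Text::CSV_XS->new({ sep_char => "\t", binary => 1, auto_diag => 1 });
open my $fh, '<', 'biochemical.tsv' or die "biochemical.tsv: $!";   # file name is an assumption
my @rows;
while (my $row = $csv->getline($fh)) {
    push @rows, $row;    # each $row is an array ref of parsed fields
}
close $fh;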

Then, when inserting, switch off the RaiseError flag:

local $sth_insert->{RaiseError} = 0;
for my $row (@rows) {
    $sth_insert->execute( @$row );
};

That way, the script no longer dies on insert errors.
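
If you also want to know which lines were skipped, you can check the return value of execute; a sketch (the warning text is just an example):

local $sth_insert->{RaiseError} = 0;
local $sth_insert->{PrintError} = 0;

my $line = 0;
for my $row (@rows) {
    $line++;
    $sth_insert->execute( @$row )
        or warn "Skipped line $line: " . $sth_insert->errstr . "\n";
};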

Replies are listed 'Best First'.
Re^2: Skip problematic lines while populating database
by Tux (Canon) on Aug 17, 2020 at 15:39 UTC

    The last section of Text::CSV_XS' docs for dumping databases notes how to load special exports, e.g. where \N denotes NULL.

    You do not need to read the entire input data file into memory; you can process it as a stream:

    use Text::CSV_XS qw( csv );

    # Make sure the field order matches the TSV order
    my $sth = $dbh->prepare (...);   # See Corion's post

    csv (
        in    => "file.tsv",
        out   => undef,
        sep   => "\t",
        on_in => sub { $sth->execute (@{$_[1]}) },
        );
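
    If your export really uses \N to denote NULL (an assumption about your dump format), the same callback could translate it to undef before inserting; a sketch:

    csv (
        in    => "file.tsv",
        out   => undef,
        sep   => "\t",
        on_in => sub {
            # map the export's NULL marker to undef (assumption: \N means NULL)
            my @fields = map { defined $_ && $_ eq "\\N" ? undef : $_ } @{$_[1]};
            $sth->execute (@fields);
            },
        );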

    Enjoy, Have FUN! H.Merijn
Re^2: Skip problematic lines while populating database
by AnomalousMonk (Archbishop) on Aug 17, 2020 at 15:55 UTC
    $sth_insert->execute( @$row );

    The only thing to look out for here is the gap in the sequence of elements to be inserted. One might write instead (untested):
        $sth_insert->execute(@{ $row }[ 0 .. 4, 6 .. 25 ]);


    Give a man a fish:  <%-{-{-{-<
