in reply to Using variables in a DBI do queriy

Your placeholders do not match your number of fields. One way I avoid this is to always use the following syntax:

my $sth = $dbh->prepare("insert into foo SET source=?, name=?, date=?, status=?");
$sth->execute('file', 'Ben marcata', '2001-05-04', '') or die $dbh->errstr;

Also note that you have a single variable $values that you were trying to use to hold two different field values. That won't work as one string; the values must be passed separately, one per placeholder.
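To make the point concrete, here is a minimal standalone sketch (sample values and column count are assumed for illustration, no database needed): interpolating two values into one string produces a single bind argument, while passing them separately produces one argument per placeholder.

```perl
use strict;
use warnings;

my @data = ('C-100', 'INV-42');    # sample values for illustration

# Wrong: interpolating both values into one string yields a SINGLE bind
# argument, so a statement with two placeholders gets too few arguments
# and DBI reports a placeholder/argument count mismatch.
my @wrong_args = ("$data[0], $data[1]");

# Right: pass each value as its own argument; DBI binds one per placeholder.
my @right_args = ($data[0], $data[1]);

printf "wrong: %d arg(s), right: %d arg(s)\n",
    scalar(@wrong_args), scalar(@right_args);
# prints "wrong: 1 arg(s), right: 2 arg(s)"
```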

Replies are listed 'Best First'.
Re^2: Using variables in a DBI do queriy
by vendion (Scribe) on May 04, 2011 at 19:47 UTC

    Yeah, the placeholders were a typo on my part, I'll admit that. If it helps clarify the situation, I am wanting to reduce these queries down to one if possible:

    $dbh->do("insert into batchpay (source,contract,invoice,substatus,original,imported)
              values (?,?,?,?,?,?)", {},
        "$dir-$filename", $data[1], $data[2], $status, $line, $datetime);

    $dbh->do("insert into batchpay (source,transaction,contract,amount,applied,account,substatus,original,imported)
              values (?,?,?,?,?,?,?,?,?)", {},
        "$dir-$filename", $data[1], $data[2], $data[3], $data[4], $data[7], $status, $line, $datetime);

    $dbh->do("insert into batchpay (source,contract,amount,tax,applied,nameonpayment,invoice,substatus,original,imported)
              values (?,?,?,?,?,?,?,?,?,?)", {},
        "$dir-$filename", $data[1], $data[2], $data[3], $data[5], $data[6], $data[7], $status, $line, $datetime);

    $dbh->do("insert into batchpay (source,nameonpayment,account,amount,applied,invoice,substatus,original,imported)
              values (?,?,?,?,?,?,?,?,?)", {},
        "$dir-$filename", $data[1], $data[2], $data[4], $data[5], $data[6], $status, $line, $datetime);
    Everything is in a while loop (looping through the files in a directory), and each of those queries is for a different file.

    Below is a snippet of code showing what I am doing:

      If you're trying to group those four queries into a single one, then I would build the query programmatically. There are a couple of things to worry about: not every column is set in every insert, and it would be very easy to assign a value to the wrong field.

      I would therefore create a list of hashes containing field name => value pairs for each record you want to create, then compile the list of all possible field names and build a single insert statement. The following code should work for the sample code you provided, but obviously it is untested:

      my %common = (
          source    => "$dir-$filename",
          substatus => $status,
          original  => $line,
          imported  => $datetime,
      );

      my @records = (
          {
              contract => $data[1],
              invoice  => $data[2],
              %common,
          },
          {
              transaction => $data[1],
              contract    => $data[2],
              amount      => $data[3],
              applied     => $data[4],
              account     => $data[7],
              %common,
          },
          {
              contract      => $data[1],
              amount        => $data[2],
              tax           => $data[3],
              applied       => $data[5],
              nameonpayment => $data[6],
              invoice       => $data[7],
              %common,
          },
          {
              nameonpayment => $data[1],
              account       => $data[2],
              amount        => $data[4],
              applied       => $data[5],
              invoice       => $data[6],
              %common,
          },
      );

      my @fields = do {
          my %seen;
          grep { !$seen{$_}++ } map { keys %$_ } @records;
      };

      my $fields           = join ',', @fields;
      my $placeholders     = '(' . join(',', ('?') x @fields) . ')';
      my $all_placeholders = join ',', ($placeholders) x @records;
      my @values           = map { @{$_}{@fields} } @records;

      $dbh->do("insert into batchpay ($fields) values $all_placeholders", {}, @values)
          or die $dbh->errstr;
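      One detail worth spelling out in that approach: the hash slice @{$_}{@fields} asks every record for every field, and fields a given record does not define come back as undef, which DBI binds as SQL NULL. Here is a minimal standalone sketch of the de-duplication and slicing steps with two hypothetical records (no database needed, field names invented for illustration):

```perl
use strict;
use warnings;

# Two illustrative records; each defines a different subset of fields.
my @records = (
    { contract => 'C-1', invoice => 'I-1' },
    { contract => 'C-2', amount  => 9.99  },
);

# Collect every field name that appears in any record, keeping only
# the first occurrence of each (the %seen grep is the usual uniq idiom).
my @fields = do {
    my %seen;
    grep { !$seen{$_}++ } map { keys %$_ } @records;
};

# Slice each record by the full field list; keys a record lacks
# yield undef, which DBI would insert as NULL.
my @values = map { @{$_}{@fields} } @records;

print scalar(@fields), " fields, ", scalar(@values), " bind values\n";
# prints "3 fields, 6 bind values"
```

      Note that keys %$_ returns keys in an unpredictable order, so the column order in the generated SQL varies between runs; that is harmless here because the values are sliced with the same @fields list that builds the column list.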

        This looks very interesting, although I'm not quite sure I understand what is going on in it. My main question is: how would the hash key/value pairs be correctly matched to the @data array? Is it set by the order in which the files are read, so the first file read would match:

        { contract => $data[1], invoice => $data[2], %common, },
        and so on?