in reply to dbi update statement is very slow

Have you actually checked the connect statement for errors? It seems quite likely that the program is telling the truth and the server really cannot be found. Do you have any other DBI-using scripts running from this server?

--
Tommy
Too stupid to live.
Too stubborn to die.

Re: Re: dbi update statement is very slow
by ctaustin (Sexton) on Aug 14, 2002 at 16:17 UTC

    Database: Oracle (8.1.7.3)

    I am not performing a commit after the initial data load because I am inserting into a transaction-level temporary table, so a commit would delete the data.

    We have several DBI scripts that make calls to various Oracle databases, this specific instance included, so I know that the connection itself works.

    Here is the subroutine that is doing the work:

    sub load_data {
        $dbh = DBI->connect("dbi:Oracle:$db", "$luser", "$pass",
            { PrintError => 1, AutoCommit => 0 })
            or &error("cannot connect: $DBI::errstr\n");

        #open(INFILE,">$dir") || &error("Cant open $dir: $!");
        open(FILE, $dir) || &error("Can't open $dir: $!");
        while ($line = <FILE>) {
            chomp($line);
            ($id, $value) = split /\|/, $line;
            push(@edits, $line);
        }
        close FILE;

        $x = @edits;
        $i = 0;
        while ($i < $x) {
            ($id, $value) = split /\|/, $edits[$i];
            $q = qq{insert into convert.sql_edits values (\'$id\',\'$value\')};
            $sth = $dbh->prepare($q);
            $sth->execute;
            push(@total, $sth->rows);
            $i++;
        }
        $xtotal = @total;
        $sth->finish;

        if    ($type eq "other")    { $table = "convert.b"; }
        elsif ($type eq "input")    { $table = "convert.i"; }
        elsif ($type eq "output")   { $table = "convert.o"; }
        elsif ($type eq "standard") { $table = "convert.";  }

        if ($field eq "p1h_secondary_dx") {
            $sql = "update $table" . "dx$proc a set $field=(select trim(edit_value) from convert.sql_edits b where a.counter=b.id) where exists (select 1 from convert.sql_edits b where a.counter=b.id)";
        }
        elsif ($field eq "p1h_admit_date" || $field eq "p1h_admit_dx" || $field eq "p1h_admit_hour"
            || $field eq "p1h_admit_phys" || $field eq "p1h_attending_phys" || $field eq "p1h_birth_date"
            || $field eq "p1h_consulting_phys1" || $field eq "p1h_consulting_phys2" || $field eq "p1h_consulting_phys3"
            || $field eq "p1h_drg_hosp" || $field eq "p1h_hosp_admit_source" || $field eq "p1h_hosp_admit_type"
            || $field eq "p1h_hosp_admit_type" || $field eq "p1h_hosp_disch_disposition" || $field eq "p1h_hosp_pat_type"
            || $field eq "p1h_hosp_patient_type" || $field eq "p1h_hosp_payor_code" || $field eq "p1h_hosp_reimb_serv_code"
            || $field eq "p1h_hosp_service_code" || $field eq "p1h_mdc_hosp" || $field eq "p1h_prev_disch_date"
            || $field eq "p1h_prin_dx" || $field eq "p1h_race" || $field eq "p1h_readmit_flag"
            || $field eq "P1H_REFERRING_PHYS" || $field eq "p1h_reimb_amt" || $field eq "p1h_sex_code"
            || $field eq "p1h_site_of_reimb_calc" || $field eq "p1h_tissue_result") {
            $sql = "update $table" . "hdr$proc a set $field=(select trim(edit_value) from convert.sql_edits b where a.header_counter=b.id) where exists (select 1 from convert.sql_edits b where a.header_counter=b.id)";
        }
        elsif ($field eq "p1h_procedure_code" || $field eq "p1h_px_date" || $field eq "p1h_surgeon") {
            $sql = "update $table" . "epi$proc a set $field=(select trim(edit_value) from convert.sql_edits b where a.counter=b.id) where exists (select 1 from convert.sql_edits b where a.counter=b.id)";
        }
        elsif ($field eq "p2h_chg_amt" || $field eq "p2h_cpt_code" || $field eq "p2h_ordering_phys_code"
            || $field eq "p2h_serv_unit_count" || $field eq "p2h_service_date") {
            $sql = "update $table" . "det$proc a set $field=(select trim(edit_value) from convert.sql_edits b where a.trailer_counter=b.id) where exists (select 1 from convert.sql_edits b where a.trailer_counter=b.id)";
        }

        $sth = $dbh->do("$sql")
            || &error("Unable to perform update statement:<br /><br />$sql<br /><br />$DBI::errstr\n");
        $updated = $sth->rows;
        $dbh->commit()
            || &error("Unable to commit changes made by the update statement: <br /><br />$sql<br /><br />$DBI::errstr\n");;
        $sth->finish;

        print $co->header();
        print "<html><head><title>Complete</title></head>
        <body>
        <h3>Summary</h3>
        <blockquote>
          <h5>Upload</h5>
          <blockquote>
            The file <b>$the_file</b> was successfully uploaded.<br /><br />
            The destination file <b>$dir</b> was created.<br /><br />
          </blockquote>
        </blockquote>
        <blockquote>
          <h5>File</h5>
          <blockquote>
            $xtotal records were loaded into the convert.sql_edits table<br /><br />
          </blockquote>
        </blockquote>
        <blockquote>
          <h5>Updates</h5>
          <blockquote>
            The update statement run was:<br /><br />
            $sql <br /><br />
            A Total of $updated records were updated<br /><br />
          </blockquote>
        </blockquote>";

        system("rm $dir");
        $dbh->disconnect or warn "disconnect failed: $DBI::errstr\n";
        print "You have been disconnected from the database. All data in convert.sql_edits for your current session has been automatically deleted.<br /><br />
        </body>
        </html>";
        exit;
    } #end load_data

    Thanks.

      Found it!

      First, clean up all the preamble: the two while loops at the top (read the file into @edits, then insert line by line) can be collapsed into one:

      open (FILE, $dir) || &error("Can't open $dir: $!");
      $sth = $dbh->prepare("insert into convert.sql_edits values (?, ?)");
      while (<FILE>) {
          chomp;
          ($id, $value) = split /\|/;
          $sth->execute($id, $value);
          push @total, $sth->rows;
      }
      $sth->finish;
      close FILE;
      since you're not using any of the values read (other than via the database).

      The subsequent SQL statement building can be simplified, since you're using $type and $field to determine the table name, and the rest of the SQL statement doesn't vary.
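
      For example, a small dispatch table could pick the table suffix and the matching join column in one place. This is only a sketch: the %field_map and %type_map names are mine, only a handful of the fields are shown, and $id_col stands for whichever counter column the field joins on.

      # Hypothetical sketch, not the original code: map each field to its
      # table suffix and the column that joins to convert.sql_edits.
      my %field_map = (
          p1h_secondary_dx   => [ 'dx',  'counter'         ],
          p1h_admit_date     => [ 'hdr', 'header_counter'  ],
          p1h_procedure_code => [ 'epi', 'counter'         ],
          p2h_chg_amt        => [ 'det', 'trailer_counter' ],
          # ... the rest of the fields from the elsif chains go here ...
      );
      my %type_map = ( other => 'b', input => 'i', output => 'o', standard => '' );

      my ($suffix, $id_col) = @{ $field_map{$field} || [] }
          or &error("Don't know how to update field $field\n");
      my $table = "convert.$type_map{$type}$suffix$proc";

      my $sql = "update $table a
                 set $field=(select trim(edit_value) from convert.sql_edits b where a.$id_col=b.id)
                 where exists (select 1 from convert.sql_edits b where a.$id_col=b.id)";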

      And having cleared that sheer volume of setup out of the way, and thus determined that it can't possibly be the cause of the problem, the only part left is:

      $sth=$dbh->do($sql)
          || &error("Unable to perform update statement:<br /><br />$sql<br /><br />$DBI::errstr\n");
      $updated=$sth->rows;
      $dbh->commit()
          || &error("Unable to commit changes made by the update statement: <br /><br />$sql<br /><br />$DBI::errstr\n");;
      $sth->finish;

      But the documentation says that $dbh->do returns a row count, not a statement handle. So there you go.
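
      In other words, take the row count straight from do's return value; a minimal corrected sketch, keeping the original variable names and error handler, would be:

      # do() returns the number of rows affected ("0E0" for zero rows, which
      # is true but numerically 0), so there is no handle to call rows() or
      # finish() on.
      my $updated = $dbh->do($sql)
          or &error("Unable to perform update statement:<br /><br />$sql<br /><br />$DBI::errstr\n");
      $dbh->commit()
          or &error("Unable to commit changes made by the update statement:<br /><br />$sql<br /><br />$DBI::errstr\n");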

      Finally, why bother with the convert.sql_edits table at all? For every record in the table being updated, get the counter value, look it up in the edits read from the file, and set the appropriate field. That would probably save a lot of grief: you'd have one database call per record updated rather than one per convert line, which may or may not be fewer, but the rows wouldn't all have to be deleted afterward, either.
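
      A rough sketch of that route, assuming $table, $field, and the matching counter column (called $id_col here) have been resolved as in the original post:

      # Hypothetical sketch: skip convert.sql_edits entirely and update row
      # by row from an in-memory hash of id => value pairs.
      my %edits;
      open(FILE, $dir) || &error("Can't open $dir: $!");
      while (<FILE>) {
          chomp;
          my ($id, $value) = split /\|/;
          $edits{$id} = $value;
      }
      close FILE;

      my $upd = $dbh->prepare("update $table set $field = ? where $id_col = ?");
      my $updated = 0;
      for my $id (keys %edits) {
          $updated += $upd->execute($edits{$id}, $id);   # execute() returns the row count
      }
      $upd->finish;
      $dbh->commit() or &error("Unable to commit: $DBI::errstr\n");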

      --
      Tommy
      Too stupid to live.
      Too stubborn to die.