jimbus has asked for the wisdom of the Perl Monks concerning the following question:
hello
I'm having an issue with insertions/updates using DBI. With Blokhead's help (sorry for getting snippy, Blokhead; I do try to look things up before I ask) I turned up tracing, but I'm not understanding what I'm seeing.
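For anyone following along, the trace output further down comes from DBI's built-in tracing. Something along these lines turns it on; the DSN, credentials, and trace level here are placeholders rather than what coalData.pl actually uses, and RaiseError is assumed because the eval/$@ pattern in the code below relies on execute() dying on failure:

use strict;
use warnings;
use DBI;

# Placeholder connect: DSN and credentials are examples only.
my $dbh = DBI->connect(
    'DBI:mysql:database=mydb;host=localhost',
    'user', 'password',
    { RaiseError => 1, PrintError => 0 },
);

# Trace level 2 logs each execute() together with its bound
# parameters -- that is what produces the "Binding parameters: ..."
# lines in the output below.
$dbh->trace(2);

# Tracing can also be enabled globally and redirected to a file:
# DBI->trace(2, 'dbitrace.log');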
This is the table:
+---------+--------------+------+-----+-------------------+
| Field   | Type         | Null | Key | Default           |
+---------+--------------+------+-----+-------------------+
| tstamp  | timestamp    | YES  | PRI | CURRENT_TIMESTAMP |
| url     | varchar(64)  |      | PRI |                   |
| average | double(17,1) | YES  |     | NULL              |
| maximum | bigint(20)   | YES  |     | NULL              |
| total   | double(17,0) | YES  |     | NULL              |
+---------+--------------+------+-----+-------------------+
Here is the code in question
foreach $key (sort keys %urls) {
    my @tmpData = @{$urls{$key}};
    print "\nDATA: @tmpData\n";

    eval {
        $i2_stmt->execute($tmpData[0], $tmpData[1], $tmpData[2],
                          $tmpData[3], $tmpData[4]);
    };
    if ($@) {
        $u2_stmt->execute($tmpData[2], $tmpData[4], $tmpData[4],
                          $tmpData[3], $tmpData[4], $tmpData[0],
                          $tmpData[1]);
    }
}
The general idea is to insert the incoming data, but if a row already exists, update it instead. For test data I'm loading the same data twice, so in the trace output I should be seeing a nice flow of insert, fail, update, insert, fail, update... but instead I'm seeing insert, fail, update, insert, insert, insert, fail, update, fail, exit.
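For context, $i2_stmt and $u2_stmt are prepared earlier in the script; the prepare() calls aren't shown here, but the SQL in the "Binding parameters" lines of the trace below suggests they look roughly like this (treat this as a reconstruction, not the verbatim code from coalData.pl):

# Reconstructed from the SQL visible in the trace's "Binding
# parameters" lines; not the verbatim prepare() calls from coalData.pl.
my $i2_stmt = $dbh->prepare(
    'INSERT INTO tmp_urls (tstamp, url, average, maximum, total)
     VALUES (?, ?, ?, ?, ?)'
);

# Placeholder order matches the update execute() above:
# average, total, total, maximum, total, tstamp, url.
my $u2_stmt = $dbh->prepare(
    'UPDATE tmp_urls
        SET average = ROUND((average * total + ? * ?) / (total + ?), 2),
            maximum = GREATEST(maximum, ?),
            total   = total + ?
      WHERE tstamp = ? AND url = ?'
);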
DATA: 05-09-30 10:00 cnet.com 1.0000 1 2
    -> execute for DBD::mysql::st (DBI::st=HASH(0x838fd60)~0x804d26c '05-09-30 10:00' 'cnet.com' '1.0000' '1' '2')
    -> dbd_st_execute for 0815d350
       Binding parameters: insert tmp_urls (tstamp,url,average,maximum,total) values ('05-09-30 10:00','cnet.com','1.0000','1','2')
       Duplicate entry '2005-09-30 10:00:00-cnet.com' for key 1 error 1062 recorded: Duplicate entry '2005-09-30 10:00:00-cnet.com' for key 1
    <- dbd_st_execute -2 rows
    !! ERROR: 1062 'Duplicate entry '2005-09-30 10:00:00-cnet.com' for key 1' (err#0)
    <- execute= undef at coalData.pl line 68 via ./coalData.pl line 67
    -> execute for DBD::mysql::st (DBI::st=HASH(0x83c3110)~0x838fe20 '1.0000' '2' '2' '1' '2' '05-09-30 10:00' 'cnet.com')
    -> dbd_st_execute for 0838fe14
       Binding parameters: update tmp_urls set average = round((average*total+'1.0000'*'2')/(total+'2'),2), maximum = greatest(maximum,'1'), total = total + '2' where tstamp = '05-09-30 10:00' and url = 'cnet.com'
    <- dbd_st_execute 1 rows
    <- execute= 1 at coalData.pl line 70

DATA: 05-09-30 12:00 cnet.com 1.0000 1 3
    -> execute for DBD::mysql::st (DBI::st=HASH(0x838fd60)~0x804d26c '05-09-30 12:00' 'cnet.com' '1.0000' '1' '3')
    -> dbd_st_execute for 0815d350
       Binding parameters: insert tmp_urls (tstamp,url,average,maximum,total) values ('05-09-30 12:00','cnet.com','1.0000','1','3')
    <- dbd_st_execute 1 rows
    <- execute= 1 at coalData.pl line 68 via ./coalData.pl line 67

DATA: 05-09-30 13:00 cnet.com 1.2000 2 6
    -> execute for DBD::mysql::st (DBI::st=HASH(0x838fd60)~0x804d26c '05-09-30 13:00' 'cnet.com' '1.2000' '2' '6')
    -> dbd_st_execute for 0815d350
       Binding parameters: insert tmp_urls (tstamp,url,average,maximum,total) values ('05-09-30 13:00','cnet.com','1.2000','2','6')
       Duplicate entry '2005-09-30 13:00:00-cnet.com' for key 1 error 1062 recorded: Duplicate entry '2005-09-30 13:00:00-cnet.com' for key 1
    <- dbd_st_execute -2 rows
    !! ERROR: 1062 'Duplicate entry '2005-09-30 13:00:00-cnet.com' for key 1' (err#0)
    <- execute= undef at coalData.pl line 68 via ./coalData.pl line 67
    -> execute for DBD::mysql::st (DBI::st=HASH(0x83c3110)~0x838fe20 '1.2000' '6' '6' '2' '6' '05-09-30 13:00' 'cnet.com')
    -> dbd_st_execute for 0838fe14
       Binding parameters: update tmp_urls set average = round((average*total+'1.2000'*'6')/(total+'6'),2), maximum = greatest(maximum,'2'), total = total + '6' where tstamp = '05-09-30 13:00' and url = 'cnet.com'
       Duplicate entry '2005-10-13 15:58:42-cnet.com' for key 1 error 1062 recorded: Duplicate entry '2005-10-13 15:58:42-cnet.com' for key 1
    <- dbd_st_execute -2 rows
    !! ERROR: 1062 'Duplicate entry '2005-10-13 15:58:42-cnet.com' for key 1' (err#0)
    <- execute= undef at coalData.pl line 70
DBD::mysql::st execute failed: Duplicate entry '2005-10-13 15:58:42-cnet.com' for key 1 at ./coalData.pl line 72.
    -> DESTROY for DBD::mysql::st (DBI::st=HASH(0x838fe20)~INNER)
       ERROR: 1062 'Duplicate entry '2005-10-13 15:58:42-cnet.com' for key 1' (err#0)
    <- DESTROY= undef
    -> DESTROY for DBD::mysql::st (DBI::st=HASH(0x804d26c)~INNER)
       ERROR: 1062 'Duplicate entry '2005-10-13 15:58:42-cnet.com' for key 1' (err#0)
    <- DESTROY= undef
    -> DESTROY for DBD::mysql::st (DBI::st=HASH(0x804d254)~INNER)
       ERROR: 1062 'Duplicate entry '2005-10-13 15:58:42-cnet.com' for key 1' (err#0)
    <- DESTROY= undef
    -> DESTROY for DBD::mysql::db (DBI::db=HASH(0x838fb20)~INNER)
       &imp_dbh->mysql: 806405c
       ERROR: 1062 'Duplicate entry '2005-10-13 15:58:42-cnet.com' for key 1' (err#0)
    <- DESTROY= undef
The DATA lines are a print of the input data. I'm also seeing fresh rows being inserted with the current time, which might indicate that null timestamps are sneaking in somewhere, but I don't see that in the DATA lines or in the values bound to the update.
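One way to confirm or rule that out would be to dump the table right after a run and see whether any rows carry the script's wall-clock time instead of one of the bound tstamp values. A quick hypothetical check (same handle and table as above) might look like:

# Hypothetical diagnostic: list what actually landed in tmp_urls,
# to see whether any rows carry the run's current time rather than
# one of the bound '05-09-30 HH:00' values.
my $rows = $dbh->selectall_arrayref(
    'SELECT tstamp, url, average, maximum, total
       FROM tmp_urls
      ORDER BY tstamp, url'
);

for my $row (@$rows) {
    print join("\t", map { defined $_ ? $_ : 'NULL' } @$row), "\n";
}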
thanks
Jimbus
Replies are listed 'Best First'.
Re: DBI strangeness
  by pg (Canon) on Oct 14, 2005 at 01:46 UTC
  by jimbus (Friar) on Oct 14, 2005 at 16:36 UTC
  by poj (Abbot) on Oct 14, 2005 at 17:49 UTC
  by pajout (Curate) on Oct 14, 2005 at 07:47 UTC
  by jimbus (Friar) on Oct 14, 2005 at 16:42 UTC
Re: DBI strangeness
  by jimbus (Friar) on Oct 14, 2005 at 18:15 UTC