andreas1234567 has asked for the wisdom of the Perl Monks concerning the following question:

Update Fri May 30 08:29:59 CEST 2008: I have been fooled by MySQL's Type Conversion in Expression Evaluation:
Comparisons that use floating-point numbers (or values that are converted to floating-point numbers) are approximate because such numbers are inexact. This might lead to results that appear inconsistent:
mysql> SELECT '18015376320243458' = 18015376320243458;
        -> 1
mysql> SELECT '18015376320243459' = 18015376320243459;
        -> 0
This underlines the importance of using
$sth->bind_param(1, $n, { TYPE => SQL_INTEGER });
rather than
$sth->bind_param(1, $n);
when $n is a number; otherwise the value is sent as a string, and the comparison against the DECIMAL column can go through exactly the kind of inexact floating-point conversion shown above.
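To see the approximation at work outside the database, here is a minimal sketch (it assumes a Perl with 64-bit integers and IEEE doubles; the sample values sit around the first failing range reported below). It does not pinpoint exactly which values the server mishandles, but it shows why any comparison routed through a double is approximate for numbers of this size:

use strict;
use warnings;

# Near 6.7e18 adjacent IEEE doubles are 1024 apart, so a 19-digit value
# in this range usually changes when it is forced through a double.
for my $n (6700000000000000499, 6700000000000000500, 6700000000000000512) {
    my $via_double = sprintf "%.0f", $n;   # integer -> double -> string
    printf "%s -> %s (%s)\n", $n, $via_double,
        $via_double eq "$n" ? "survives" : "changed by the double conversion";
}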

Update Wed May 28 18:35:30 CEST 2008: This is most likely a bug in MySQL server on x86_64.

The environment:

perl, v5.8.8 built for x86_64-linux-thread-multi
DBI is up to date (1.603).
DBD::mysql is up to date (4.006).
MySQL Server version: 5.0.33 Source distribution
Linux 2.6.9-42.ELsmp x86_64 x86_64 x86_64 GNU/Linux
I have a MySQL table defined as below, intended for 19-digit account numbers taken from a financial institution.
create table foo(i decimal(23,0) primary key);
I have noticed that for the first of the two following statements, the subsequent $sth->execute() will return 0 rows for certain 19-digit values of $n, while the second will return 1 row:
$sth->bind_param(1, $n);
$sth->bind_param(1, $n, { TYPE => SQL_INTEGER });
Interestingly, the difference disappears when the MySQL table is defined as
create table foo(i bigint primary key);
Note that bigint can hold values only up to 9223372036854775807 (and is thus useless to me, since I cannot guarantee the values will stay below this limit). The test script:
use strict;
use warnings;
use Test::More;
use Data::Dumper;
use DBI qw(:sql_types);
use ExtUtils::Installed;

my %modules = map { $_ => 1 } qw(DBI DBD::mysql);
my $instmod = ExtUtils::Installed->new();
foreach my $m ($instmod->modules()) {
    diag "$m: " . $instmod->version($m) . "\n" if $modules{$m};
}

my $dbh = undef;

sub delete_all {
    my $sth = $dbh->prepare(q{delete from foo}) or die "prepare failed";
    my $retval = $sth->execute() or die "execute failed";
    return 1;
}

sub insert_range {
    my ($n, $i) = @_;
    my $sth = $dbh->prepare(q{insert into foo values (?)}) or die "prepare failed";
    my $count = 0;
    foreach my $j (1 .. $i) {
        my $retval = $sth->execute($n + $j) or die "execute failed";
        $count++;
    }
    return $count;
}

sub dynamic_sql {
    my $n = shift;
    my $sth = $dbh->prepare(qq{select i from foo where i=$n}) or die "prepare failed";
    my $retval = $sth->execute() or die "execute failed";
    my $count = 0;
    $count++ while (my $row = $sth->fetchrow_arrayref());
    return $count;
}

sub bindparams_no_sql_type {
    my $n = shift;
    my $sth = $dbh->prepare(q{select i from foo where i=?}) or die "prepare failed";
    $sth->bind_param(1, $n);
    my $retval = $sth->execute() or die "execute failed";
    my $count = 0;
    $count++ while (my $row = $sth->fetchrow_arrayref());
    return $count;
}

sub bindparams_sql_type {
    my $n = shift;
    my $sth = $dbh->prepare(q{select i from foo where i=?}) or die "prepare failed";
    $sth->bind_param(1, $n, { TYPE => SQL_INTEGER });
    my $retval = $sth->execute() or die "execute failed";
    my $count = 0;
    $count++ while (my $row = $sth->fetchrow_arrayref());
    return $count;
}

# ------ main ------
my @args = qw(DBI:mysql:test foo bar);
$dbh = DBI->connect(@args) or die "connect failed";

my @data  = ();
my $start = 6700000000000000000;
my $range = 1000;
plan tests => 2 + 3 * $range;

# ------ tests ------
cmp_ok(delete_all(), q{==}, 1, q{Expect delete_all to return 1});
cmp_ok(insert_range($start, $range), q[==], $range, qq[Expect $range insert(s)]);
for (1 .. $range) {
    my $i = $start + $_;
    cmp_ok(dynamic_sql($i), q{==}, 1,
        qq{Expect 1 row from dynamic_sql($i)});
    cmp_ok(bindparams_sql_type($i), q{==}, 1,
        qq{Expect 1 row from bindparams_sql_type($i)});
    cmp_ok(bindparams_no_sql_type($i), q{==}, 1,
        qq{Expect 1 row from bindparams_no_sql_type($i)});
}
__END__
The test output:
$ prove bind_param.pl
bind_param.......# DBD::mysql: 4.006
# DBI: 1.603
bind_param.......2/3002
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000500)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000501)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000502)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000503)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000504)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000505)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000506)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000507)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000508)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000509)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000510)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000511)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
#   Failed test 'Expect 1 row from bindparams_no_sql_type(6700000000000000512)'
#   at bind_param.pl line 105.
#          got: 0
#     expected: 1
bind_param.......1563/?
# Looks like you failed 13 tests of 3002.
bind_param....... Dubious, test returned 13 (wstat 3328, 0xd00)
 Failed 13/3002 subtests

Test Summary Report
-------------------
bind_param.pl (Wstat: 3328 Tests: 3002 Failed: 13)
  Failed tests:  1502, 1505, 1508, 1511, 1514, 1517, 1520
                 1523, 1526, 1529, 1532, 1535, 1538
  Non-zero exit status: 13
Files=1, Tests=3002,  2 wallclock secs ( 0.82 usr  0.03 sys +  1.90 cusr  0.14 csys =  2.89 CPU)
Result: FAIL
Failed 1/1 test programs. 13/3002 subtests failed.
$
By increasing $range to 10000, I can see that the following ranges are affected (note that the results are reproducible, i.e. identical between successive runs):
6700000000000000500 - 6700000000000000512
6700000000000001536 - 6700000000000002499
6700000000000002561 - 6700000000000003583
6700000000000004400 - 6700000000000004608
6700000000000005632 - 6700000000000005999
6700000000000006500 - 6700000000000007679
6700000000000008500 - 6700000000000008704
6700000000000009400 - 6700000000000009727
I have no explanation for why this affects only certain values. What do the values above have in common? Attempts to reproduce the same behavior and/or explain it are very welcome.

Update 2008-05-28 14:56:44: It's probably a bug in MySQL 5.0.33 that has been fixed in MySQL 5.0.51a:

I have reproduced the same error with MySQL 5.0.51b on x86_64. MySQL 5.0.51a on i386 returns the expected 1 row for each of the select queries below, while MySQL 5.0.51b on x86_64 does not:

mysql> select version();
+-----------+
| version() |
+-----------+
| 5.0.51b   |
+-----------+
1 row in set (0.00 sec)

mysql> use test;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> drop table if exists foo;
Query OK, 0 rows affected (0.00 sec)

mysql> create table foo(i decimal(23,0) primary key);
Query OK, 0 rows affected (0.02 sec)

mysql> insert into foo values(6700000000000000498);
Query OK, 1 row affected (0.00 sec)

mysql> insert into foo values(6700000000000000499);
Query OK, 1 row affected (0.00 sec)

mysql> insert into foo values(6700000000000000500);
Query OK, 1 row affected (0.00 sec)

mysql> insert into foo values(6700000000000000501);
Query OK, 1 row affected (0.00 sec)

mysql> insert into foo values(6700000000000000502);
Query OK, 1 row affected (0.00 sec)

mysql> select i from foo where i=6700000000000000498;
+---------------------+
| i                   |
+---------------------+
| 6700000000000000498 |
+---------------------+
1 row in set (0.00 sec)

mysql> select i from foo where i='6700000000000000498';
+---------------------+
| i                   |
+---------------------+
| 6700000000000000498 |
+---------------------+
1 row in set (0.00 sec)

mysql> select i from foo where i=6700000000000000499;
+---------------------+
| i                   |
+---------------------+
| 6700000000000000499 |
+---------------------+
1 row in set (0.00 sec)

mysql> select i from foo where i='6700000000000000499';
+---------------------+
| i                   |
+---------------------+
| 6700000000000000499 |
+---------------------+
1 row in set (0.00 sec)

mysql> select i from foo where i=6700000000000000500;
+---------------------+
| i                   |
+---------------------+
| 6700000000000000500 |
+---------------------+
1 row in set (0.00 sec)

mysql> select i from foo where i='6700000000000000500';
Empty set (0.00 sec)

mysql> select i from foo where i=6700000000000000501;
+---------------------+
| i                   |
+---------------------+
| 6700000000000000501 |
+---------------------+
1 row in set (0.00 sec)

mysql> select i from foo where i='6700000000000000501';
Empty set (0.00 sec)

mysql> select i from foo where i=6700000000000000502;
+---------------------+
| i                   |
+---------------------+
| 6700000000000000502 |
+---------------------+
1 row in set (0.00 sec)

mysql> select i from foo where i='6700000000000000502';
Empty set (0.00 sec)
--
No matter how great and destructive your problems may seem now, remember, you've probably only seen the tip of them. [1]

Replies are listed 'Best First'.
Re: Typeless bind fails for certain large numbers with MySQL decimal
by moritz (Cardinal) on May 28, 2008 at 12:12 UTC
    My attempts to reproduce this (on a 32 bit platform):
    $ perl 688836.pl
    1..3002
    ok 1 - Expect delete_all to return 1
    DBD::mysql::st execute failed: Duplicate entry '6700000000000000000' for key 1 at 688836.pl line 33.
    execute failed at 688836.pl line 33.
    # Looks like you planned 3002 tests but only ran 1.
    # Looks like your test died just after 1.
    $ echo 'delete from foo' | mysql moritz2
    $ perl -Mbigint 688836.pl
    1..3002
    ok 1 - Expect delete_all to return 1
    ok 2 - Expect 1000 insert(s)
    ok 3 - Expect 1 row from dynamic_sql(6700000000000000001)
    Segmentation fault

    As I said in the CB, take a look at the trace options in the DBI manpage. Maybe it helps.
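    A minimal way to switch tracing on for a run (the trace level and log file name here are just examples; the available levels are described in the DBI docs):

    # Log DBI activity to a file; higher levels show more detail about
    # each call, including what gets bound to the placeholders.
    $dbh->trace(2, '/tmp/688836-dbi.log');
    # ... prepare/execute the failing statement here ...
    $dbh->trace(0);    # switch tracing off again

    The same can be done without touching the script via the DBI_TRACE environment variable, e.g. DBI_TRACE=2=/tmp/688836-dbi.log prove bind_param.pl.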

Re: Typeless bind fails for certain large numbers with MySQL decimal
by dragonchild (Archbishop) on May 28, 2008 at 12:35 UTC
    Unless you plan on doing numerical operations (like addition) on them, I would store them as CHAR(19) (or whatever your maximum is). Just because the input is limited to the digits 0-9 doesn't mean you have to store it in a numeric column. Storing it as CHAR() will still sort properly (if that is important), since all the values are the same width.
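    A rough sketch of that approach (the table name accounts is made up for the example; the account number is bound as a plain string, so no numeric conversion happens on either side of the comparison):

    # Fixed-width character column instead of DECIMAL; comparisons are then
    # plain string comparisons.
    $dbh->do(q{create table accounts (i char(19) primary key)});

    my $sth = $dbh->prepare(q{select i from accounts where i = ?});
    $sth->bind_param(1, '6700000000000000500');   # bound as a string
    $sth->execute();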

    As for why this is happening, I'm also going to guess that there is a lossy conversion somewhere. The MySQL documentation says that calculations should be accurate up to 65 digits, but it also depends on what Perl, DBD::mysql, and everyone else in the chain is doing.
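    One quick way to check what Perl itself does with such literals (just a sketch; ivsize comes from the standard Config module):

    use Config;

    # With 64-bit IVs (ivsize 8) a 19-digit literal below ~9.2e18 stays an
    # exact integer; with 32-bit IVs it silently becomes a double and
    # stringifies in exponential notation.
    printf "ivsize=%d\n", $Config{ivsize};

    my $n = 6700000000000000500;
    print "$n\n";   # full digits on 64-bit Perl, something like 6.7e+18 on 32-bit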


    My criteria for good software:
    1. Does it work?
    2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?
      Howdy!

      I'd put it even more strongly. If the data is a string of digits with no numeric significance, then it isn't numeric data, but character data. Further, if there are values with leading zeros (such as ZIP codes), you have to store it as character data or else jump through extra hoops to reapply the leading zeros. Character data won't give you that problem.

      yours,
      Michael
      I disagree. Most databases (and, finally, MySQL 5.0.2 and up) have built-in constraint checks that are very valuable. I'm not a US citizen, but I strongly suspect a ZIP code containing letters makes little sense. I feel the database should not allow values that do not adhere to the column specification.
      mysql> drop table if exists foo;
      Query OK, 0 rows affected (0.00 sec)

      mysql> create table foo (i numeric(23,0) primary key);
      Query OK, 0 rows affected (0.01 sec)

      mysql> select @@global.sql_mode;
      +-------------------+
      | @@global.sql_mode |
      +-------------------+
      |                   |
      +-------------------+
      1 row in set (0.00 sec)

      mysql> insert into foo values ('a');
      Query OK, 1 row affected, 1 warning (0.01 sec)

      mysql> set sql_mode='strict_all_tables';
      Query OK, 0 rows affected (0.00 sec)

      mysql> insert into foo values ('d');
      ERROR 1366 (HY000): Incorrect decimal value: 'd' for column 'i' at row 1

      mysql>
      See also: 1.8.6.2. Constraints on Invalid Data
      --
      No matter how great and destructive your problems may seem now, remember, you've probably only seen the tip of them. [1]
        Type constraints are different from check constraints. All databases provide type constraints, with MySQL allowing you to turn them on or off as desired. Check constraints, however, are more comprehensive.

        In this case, let's say you have a DECIMAL(23) for this 19-digit account number. What's preventing a 12-digit entry? In other words, the application will need to do a good amount of validation anyway. Anything the database provides in terms of type constraints is a bonus. Check constraints, imho, are better implemented as a service wrapping the database connection or as a set of libraries. I personally don't like embedding constraints in the database because I feel that puts too much work into the datastore.
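        For what it's worth, a sketch of such an application-side check (validate_account_number and the 19-digit rule are just illustrations for this thread, not an existing library; $sth and $account_number are assumed from a script like the one above):

        # Reject anything that is not exactly 19 digits before it reaches the
        # database; the database's type constraint is then only a safety net.
        sub validate_account_number {
            my $n = shift;
            die "account number must be exactly 19 digits\n"
                unless defined $n && $n =~ /\A[0-9]{19}\z/;
            return $n;
        }

        $sth->execute( validate_account_number($account_number) );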


        My criteria for good software:
        1. Does it work?
        2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?
Re: Typeless bind fails for certain large numbers with MySQL decimal
by RMGir (Prior) on May 28, 2008 at 12:05 UTC
    At a wild guess, there's a conversion to double happening somewhere that's losing the last few significant digits?

    I have no evidence whatsoever to support that theory, it's just my gut feeling...


    Mike