Hello. I have a script using DBI and IO::Socket. It connects via IO::Socket::INET to a one-way datasource that streams data continuously. The job of the script is to read the data and, if a certain condition is met, format it and insert it into a table. Pretty simple stuff. But what I am noticing is about 45% data loss on the inserts (the rows never make it into the table). No errors. Nada.
Here is how I set up my DB connection:
sub openDB {
    # Input arguments: string (DB server hostname), string (database name to open)
    # Requires the %dbServers hash loaded via
    #   require "/home/reporting/nwCentral/dbServers.pl";
    print DEBUGFILE "Entering sub openDB\n" if ($DEBUG);
    my $mysqlHost = shift;
    my $mysqlDB   = shift;
    my $host      = $dbServers{$mysqlHost}{'host'};
    my $mysqlUser = $dbServers{$mysqlHost}{'user'};
    my $mysqlPass = $dbServers{$mysqlHost}{'pass'};
    my $mysqlHostQry = "DBI:mysql:$mysqlDB:$host";
    my $dbh = DBI->connect($mysqlHostQry, $mysqlUser, $mysqlPass)
        or return 0;    # caller checks for a false return
    return $dbh;
}
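One thing I keep coming back to: DBI stays quiet about failed statements unless you ask it to complain. A minimal sketch of what I understand the alternative to look like -- same DSN layout and variable names as openDB above, but with RaiseError turned on so any failed prepare/execute dies instead of failing silently (the attribute hash is standard DBI; this is not my production code):

```perl
use DBI;

# Sketch only: connect with RaiseError on, so every failed
# prepare/execute throws an exception instead of returning quietly.
my $dbh = DBI->connect(
    "DBI:mysql:$mysqlDB:$host",
    $mysqlUser, $mysqlPass,
    { RaiseError => 1, PrintError => 1, AutoCommit => 1 },
) or die "connect failed: $DBI::errstr";
```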
And here is the socket connection:
sub openDATASOURCE {
    print DEBUGFILE "Entering sub openDATASOURCE\n" if ($DEBUG);
    my $socketInfo = $ipPort{$socketName};
    my ($remote_host, $remote_port) = split(/ /, $socketInfo);
    my $return = 0;
    if ($usingSocket) {
        $DATASOURCE = IO::Socket::INET->new(
            PeerAddr => $remote_host,
            PeerPort => $remote_port,
            Proto    => "tcp",
            Type     => SOCK_STREAM,
        );
        if ($DATASOURCE) {
            $return = 1;
            $DATASOURCE->autoflush(1);    # make the handle hot -- disable buffering
            print LOGFILE "$sysDate: IO: Connected to $remote_host $remote_port\n";
            &updateSocketStatus("$$", "1");
        }
        else {
            print LOGFILE "$sysDate: IO: Cannot open datasource: $@\n";
        }
    }
    else {
        open(DATASOURCE, "<", $namedPipe)
            or die "Couldn't open $namedPipe: $!\n";
        $return = 1;
    }
    return $return;
}
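Since the loss could just as easily be on the read side as the insert side, one thing I could do is instrument the read loop. A sketch (conditionMet() and formatLine() are stand-ins for my real filter and formatter) that counts every line received versus every line handed to the insert routine, so read-side loss can be ruled in or out:

```perl
# Sketch only: count lines read vs. lines kept, and log the totals
# when the datasource closes, to see where rows disappear.
my ($got, $kept) = (0, 0);
while (defined(my $line = <$DATASOURCE>)) {
    $got++;
    chomp $line;
    next unless conditionMet($line);    # conditionMet() is a stand-in
    $kept++;
    doSQL(formatLine($line));           # formatLine() is a stand-in
}
print LOGFILE "datasource closed: read $got lines, inserted $kept\n";
```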
And here is how the data is inserted: read the socket, grab the data, format it, and push it into an array. The array is then sent to the insert routine:
sub doSQL {
    print DEBUGFILE "Entering sub doSQL\n" if ($DEBUG);
    my @localArray = @_;
    my $sqlStatement = "INSERT INTO $liveTable SET @localArray";
    if ($usingDB) {
        my $sth = $mainDBh->prepare($sqlStatement)
            or &exitAll("notok", "doSQL:\nQuery= $sqlStatement\nError=" . $mainDBh->errstr);
        $sth->execute()
            or &exitAll("notok", "doSQL:\nQuery= $sqlStatement\nError=" . $sth->errstr);
    }
    print DEBUGFILE "$sqlStatement\n" if $DEBUG;
}
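If the interpolated SET clause is the culprit (odd characters in the data corrupting the generated SQL), bound placeholders would take quoting out of the picture entirely. A sketch only -- the column names are made up, and $rows comes from execute(), which returns the affected row count on success and undef on failure:

```perl
# Sketch only: placeholder-based insert. Data never touches the SQL
# text, and the row count from execute() confirms the insert landed.
my $sth = $mainDBh->prepare(
    "INSERT INTO $liveTable (colA, colB) VALUES (?, ?)"    # columns assumed
) or die "prepare failed: " . $mainDBh->errstr;

my $rows = $sth->execute($valueA, $valueB)
    or die "insert failed: " . $sth->errstr;
print DEBUGFILE "inserted $rows row(s)\n" if $DEBUG;
```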
As you can see, my DEBUG output contains the exact query sent to the DBI handle. I have identified the missing rows and run the corresponding queries manually without problems. Again, no errors were reported in any log.
I have also ensured the MySQL configuration is identical to another server performing a similar task (with different formatting). The DB is not showing any issues, only about 8 to 30 queries/second, but I am not ruling out a DB configuration/performance issue (the DB files are on SAN, which is a difference).
I have a workaround: write the data to a file and run a LOAD DATA INFILE every 30 seconds. But this bugs me, since the direct inserts SHOULD work (and I hate writing temp files). Come to think of it, maybe my error catching is problematic?
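For completeness, the workaround looks roughly like this. A sketch only: the file and table names are placeholders, and the LOCAL variant reads the file from the client side, which requires local_infile to be enabled on both the client and the server:

```perl
# Sketch of the 30-second bulk-load workaround described above.
sub flushBatch {
    my ($dbh, $file, $table) = @_;
    my $quoted = $dbh->quote($file);    # quote the client-side file path
    my $rows = $dbh->do("LOAD DATA LOCAL INFILE $quoted INTO TABLE $table")
        or die "bulk load failed: " . $dbh->errstr;
    print LOGFILE "bulk-loaded $rows row(s) from $file\n";
}
```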
Apologies for the lengthy post, but wanted to provide detail. Thank you for taking the time to read this.
Matt

In reply to MySQL DBI an Socket::IO dropping inserts somewhere by whiteonline