lorenzov has asked for the wisdom of the Perl Monks concerning the following question:

hello, i have some code which parses a text file, splits the lines and passes the items to fields in a database; the problem is that the files are long and the procedure eventually loses the connection to the database with an error of this sort:
DBD::mysql::st execute failed: Lost connection to MySQL server during query at C:\workspace\temp\data\2006-07Logs\logToDB7a.pl line 102, <FILE> line 85123.

i'm sure that my newbie knowledge of perl doesn't help, but is there anyone who can help me optimise the code and/or find a way to keep the connection alive when something goes wrong?
i'd much appreciate your expert views!
here is the code:
# use modules:
#use strict;
use DBI;
use Mysql;
use IO::Handle;
use Text::ParseWords;

# ***************************************
# this is the core to connect to the database
my $database  = "myDB";
my $tableName = "PS0002_9";
my $file      = "PS0002_9_2006.csv";
# ******************************************
my $hostname = "myhost";
my $user     = "myuser";
my $passwd   = "myPass";
my $count    = 0;

# **************************************
# CORE: connect to the DB and throw an error if not possible
my $dsn = "DBI:mysql:host=${hostname};database=${database}";
my $dbh = DBI->connect($dsn, $user, $passwd, { RaiseError => 1, AutoCommit => 1 })
    or die "Cannot connect to server\n";

# **********************************
# SQL qry for creating the Table
my $tblcreate = "
    CREATE TABLE IF NOT EXISTS $tableName (
        pkey        int(11) NOT NULL auto_increment,
        courseID    VARCHAR(80),
        sessionID   INT,
        userID      VARCHAR(20),
        personID    INT,
        myTimestamp DATETIME,
        Millisec    INT UNSIGNED,
        ACTION_NAME VARCHAR(80),
        TOOL_NAME   VARCHAR(80),
        PAGE_NAME   VARCHAR(200),
        Time        SMALLINT,
        PRIMARY KEY (pkey)
    )
";

# Prepare & execute qry
my $s = $dbh->prepare($tblcreate);
$s->execute();

# *************************************
# INSERT query
# Prepare the insertion statement.
my $sth = $dbh->prepare(
      "INSERT INTO $tableName(courseID,sessionID,userID,PersonID,myTimestamp,"
    . "Millisec,ACTION_NAME,TOOL_NAME,PAGE_NAME,Time)"
    . " VALUES (?,?,?,?,?,?,?,?,?,?)"
);

open(FILE, '<:raw:encoding(UTF-16le):crlf:utf8', $file)
    or die("Could not open $file: $!.");

$count = 0;
print("myCounter: ", $count, "\n");

# Process the data.
while (<FILE>) {
    # Lines used to debug the script.
    #exit if ($count >= 20);
    #print "=========================== Record $count\n";
    $count++;

    my @words  = $_ =~ m/"[^"\r\n]*"|[^,\r\n]*/mg;
    my @result = quotewords(',', 0, @words);
    #@result = $_ =~ m/"[^"\r\n]*"|[^,\r\n]*/mg;

    my $courseID    = $result[0];
    my $sessionID   = $result[1];
    my $userID      = $result[2];
    my $PersonID    = $result[3];
    my $mydatetime  = $result[4];
    my $millisecond = $result[5];
    my $loc         = $result[6];
    my $action      = $result[7];
    my $page        = $result[8];
    my $time        = $result[9];

    # Write the entries to the database.
    my $rows = $sth->execute($courseID, $sessionID, $userID, $PersonID, $mydatetime,
                             $millisecond, $loc, $action, $page, $time);
    #print "rows: ", $rows, "\n";
    print STDERR "Progress: ", $count, "\r";
}

# ***************************************
# SELECT query
my $select = "SELECT * FROM logTest";
#$s = $dbh->prepare($select);
#$s->execute();
#while (my @val = $s->fetchrow_array()) {
#    print " $val[0] $val[1] $val[2]\n";
#    ++$count;
#}

# *************************************
# TERMINATE & close conn
$s->finish();
$dbh->disconnect();
exit(0);

Replies are listed 'Best First'.
Re: log to DB connection issue & optimization
by moritz (Cardinal) on Jun 03, 2008 at 13:29 UTC
    Normally such a thing only happens when a connection isn't used for a long time. But you are calling ->execute for each row in the DB, so I'm a bit puzzled as to why this should happen in your case. Are you sure that the DB engine doesn't die for some obscure reason? (maybe no space left on device).

    BTW please don't comment out the use strict; line - it's your most valuable ally when hunting errors.

    Second BTW: you don't need all of those intermediate values, $sth->execute(@result) should work just fine.
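    For example, a minimal sketch of that tighter loop (assuming, as in the posted code, that the CSV fields arrive in the same order as the placeholders in the INSERT statement):

        while (my $line = <FILE>) {
            chomp $line;
            $count++;
            # quotewords splits the quoted, comma-separated line into fields directly
            my @result = quotewords(',', 0, $line);
            # hand the whole row to the prepared statement in one call
            $sth->execute(@result);
            print STDERR "Progress: $count\r";
        }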

Re: log to DB connection issue & optimization
by perrin (Chancellor) on Jun 03, 2008 at 18:37 UTC
    MySQL has timeout settings which you can set in your my.cnf file. Make them much bigger and see if that fixes your problem.
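    For example, an illustrative my.cnf fragment (the variable names are standard MySQL server settings; the values below are only a starting point, not a recommendation for your setup):

        [mysqld]
        wait_timeout       = 28800   # seconds an idle connection may sit before the server drops it
        net_read_timeout   = 600     # seconds the server waits for more data from the client
        net_write_timeout  = 600     # seconds the server waits while sending results to the client
        max_allowed_packet = 64M     # largest single packet/statement the server will accept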
Re: log to DB connection issue & optimization
by chrism01 (Friar) on Jun 04, 2008 at 02:04 UTC
    ... and check each execute for success/failure
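    For example, a minimal sketch (assuming RaiseError => 1 as in the posted connect call, so a failed execute dies and has to be caught with eval if you want to log it and carry on):

        # check each insert instead of assuming it worked
        my $ok = eval { $sth->execute(@result) };
        unless ($ok) {
            warn "Insert failed at input line $.: " . ($@ || $sth->errstr) . "\n";
            # here you could reconnect and retry, or skip the bad row and continue
        }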