PerlMonks
Issue with simultaneous MySQL actions

by Anonymous Monk
on Jul 31, 2003 at 20:00 UTC ( [id://279756] )

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

I've got a Perl script that uploads a file (usually quite large) and writes the data into a MySQL table. However, when I try running the script in another window while an older instance is still processing, the new data doesn't get recorded. To me it sounds more like a MySQL problem, but I'm not really sure what it is.

Any hints or tips would be really helpful, because I'm totally lost with this.

Thanks,
Ralph.

Replies are listed 'Best First'.
Re: Issue with simultaneous MySQL actions
by Cine (Friar) on Jul 31, 2003 at 20:02 UTC
    We can't give you any hints if you don't give us a proper way of analyzing your problem. We need sample code at the very least. Just saying "this doesn't work" is not good enough...

    T I M T O W T D I
Re: Issue with simultaneous MySQL actions
by Anonymous Monk on Jul 31, 2003 at 20:07 UTC
    $dbh = DBI->connect("dbi:mysql:db","","") or die "Error db1";

    $sth = $dbh->prepare("select max(id) from tracker where agent='abc'");
    $sth->execute;
    $newid = $sth->fetchrow_array;
    $sth->finish;

    if($newid eq '') { $newid = 1; }
    else { $newid++; }

    $msubject = $fr{subject}; $msubject =~ s/\'/\\'/g;

    $sth = $dbh->prepare("insert into tracker values ( 'abc', $newid, '$msubject', '$todaydate', '$emailto', $sntcount, 0, 'eCard', '' )");
    $sth->execute;
    $sth->finish;

    $dbh->disconnect;


    That is the code... How come it doesn't write simultaneously? Any ideas?

      There's a possibility it's being overwritten, since you're generating your row ids by hand. You could lock the table while you're doing the insert, though that's usually a bad idea. You could wrap the whole thing in a transaction, if you're using MySQL 4.x (people who want to complain and bandy about the phrase "ACID", go waste your life in Slashdot discussions; you guys bore me, at best).

      I'd rather make the id column AUTO_INCREMENT, though, in which case you don't have to insert anything and it generates a unique ID automatically.

      Other possibilities are that your uploaded file is hitting a resource limit, or that your server disallows multiple CGI programs from running simultaneously. This really feels like a concurrency issue, though.

      That code has a race condition that could explain the trouble you're having. Here's an example.
      1. prog1 executes the select statement. max(id) is currently 3.
      2. prog2 executes the select statement. max(id) is still 3.
      3. prog1 finishes out the code, incrementing id and then inserting all of the information for record with id 4.
      4. prog2 finishes out the code, incrementing id and then inserting all of the information for record with id 4.

      Depending on how your system is set up, the result could be that the second request is rejected (you don't do any error checking in your example, so you'd never know) or that both are inserted, and you only look at the first one your query returns (for example, in your sample select statement, you just ignore any additional rows that the query returns).

      The solution is what chromatic suggested: make this action atomic somehow. Use an AUTO_INCREMENT column, lock the table, or incorporate finding the ID into the SQL statement. As a kludgey alternative, require that the id column be unique, catch the error resulting from your insert statement, then get the new biggest id and try again.
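      The atomic route can be sketched directly in DBI. This is a sketch only, not the poster's code: it assumes the tracker table's id column has been redefined as AUTO_INCREMENT, uses sample values for the variables from the original post, and needs DBD::mysql plus a reachable MySQL database to actually run.

```perl
#!/usr/bin/perl
# Sketch: assumes `id` is AUTO_INCREMENT PRIMARY KEY in `tracker`
# and a reachable MySQL database named `db`.
use strict;
use warnings;
use DBI;

# Sample values standing in for the variables in the original script.
my ($msubject, $todaydate, $emailto, $sntcount) =
    ("It's a test", '2003-07-31', 'someone@example.com', 1);

my $dbh = DBI->connect("dbi:mysql:db", "", "", { RaiseError => 1 })
    or die "Error db1: $DBI::errstr";

# No id in the column list: MySQL assigns the next one atomically,
# so two simultaneous instances can never pick the same value.
# Placeholders also handle quoting, replacing the s/\'/\\'/g hack.
$dbh->do(
    "INSERT INTO tracker (agent, subject, date, receiver, sent, etype)
     VALUES (?, ?, ?, ?, ?, ?)",
    undef, 'abc', $msubject, $todaydate, $emailto, $sntcount,
);

$dbh->disconnect;
```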

Re: Issue with simultaneous MySQL actions
by cianoz (Friar) on Aug 01, 2003 at 09:13 UTC
    I see at least two issues in your code:
    1) you should use an AUTO_INCREMENT column to store the id, as stated by others
    2) you should use $dbh->quote() instead of $msubject =~ s/\'/\\'/g
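    A minimal sketch of the second point, assuming a live MySQL handle. Both forms below are safe against embedded quotes, unlike the regex substitution, which misses backslashes and other metacharacters:

```perl
# Sketch: two safe alternatives to hand-escaping quotes with a regex.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:mysql:db", "", "", { RaiseError => 1 });
my $msubject = q{It's a "quoted" subject};

# Alternative 1: have the driver produce a correctly quoted SQL literal.
my $lit = $dbh->quote($msubject);
$dbh->do("INSERT INTO tracker (agent, subject) VALUES ('abc', $lit)");

# Alternative 2 (usually preferable): placeholders, no quoting anywhere.
$dbh->do("INSERT INTO tracker (agent, subject) VALUES (?, ?)",
         undef, 'abc', $msubject);
```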
Re: Issue with simultaneous MySQL actions
by Anonymous Monk on Jul 31, 2003 at 22:04 UTC
    I've changed it for the following now:

    $dbh = DBI->connect("dbi:mysql:db","","") or die "Error db1";
    $msubject = $fr{subject}; $msubject =~ s/\'/\\'/g;

    $sth = $dbh->prepare("insert into tracker ( agent, subject, date, receiver, sent, etype ) values ( 'abc', '$msubject', '$todaydate', '$emailto', $sntcount, 'eCard' )");
    $sth->execute;
    $sth->finish;

    $sth = $dbh->prepare("select max(id) from tracker where agent='abc'");
    $sth->execute;
    $newid = $sth->fetchrow_array;
    $sth->finish;

    $dbh->disconnect;

    The 'id' column is now AUTO_INCREMENT. However, I have to execute the max(id) query after writing the data because I need to fetch the newest record. Do you think this will work better?

    Thanks,
    Ralph.
      you should consider storing an agent's max id in another table. (The following is MS-SQL compliant; your flavor may require something slightly different.)

      you can quickly create this table like so:

      select agent,max(id) into agent_id from tracker group by agent
      and then, before running your insert, run a transactioned query to get and set the max id of an agent:

      begin tran
      set nocount on
      update agent_id set ID = ID + 1 where Name = 'abc'
      select ID from agent_id where Name = 'abc'
      commit tran

      this should ensure you have no ID collisions

      HTH!

      Update: I just read the auto-incremental part of your post... which pretty much squashes this reply. shame on me.
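      For the record, a MySQL-flavored sketch of the same counter-table idea, using MySQL's LAST_INSERT_ID(expr) form so the increment and the read happen atomically. The table and column names follow the snippet above, and a live database is assumed:

```perl
# Sketch: per-agent counter table in MySQL instead of MS-SQL.
# Assumes: CREATE TABLE agent_id (name VARCHAR(32) PRIMARY KEY, id INT)
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:mysql:db", "", "", { RaiseError => 1 });

# LAST_INSERT_ID(id + 1) both stores the new value and remembers it
# for this connection, so no other instance can read a stale counter.
$dbh->do("UPDATE agent_id SET id = LAST_INSERT_ID(id + 1) WHERE name = 'abc'");
my ($newid) = $dbh->selectrow_array("SELECT LAST_INSERT_ID()");
```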

      ..barely started my morning coffee, so it's possible I'm elsewhere still, but I think you want mysql_insertid, which will return the auto-incremented id from your last insert.

      cheers,

      J

      It's in the manpage :)

      mysql_insertid
          MySQL has the ability to choose unique key values automatically. If this happened, the new ID will be stored in this attribute. An alternative way for accessing this attribute is via $dbh->{'mysql_insertid'}. (Note we are using the $dbh in this case!)
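      That attribute replaces the follow-up max(id) query entirely. A sketch, assuming the AUTO_INCREMENT column and a live database:

```perl
# Sketch: read back the id MySQL just assigned to this connection's
# insert, instead of a separate (racy) "select max(id)" query.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:mysql:db", "", "", { RaiseError => 1 });

$dbh->do("INSERT INTO tracker (agent, subject) VALUES (?, ?)",
         undef, 'abc', 'hello');

# Scoped per connection, so concurrent inserts cannot interfere.
my $newid = $dbh->{mysql_insertid};
```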


      T I M T O W T D I
Re: Issue with simultaneous MySQL actions
by graff (Chancellor) on Aug 01, 2003 at 03:45 UTC
    One other approach you could consider (as a last resort, perhaps, if the MySQL-centric approaches fail) is to have your Perl script use a semaphore file. The chunk of code that does the database transaction is made atomic: whenever two or more instances of the script try to run that chunk at the same time, the semaphore file "serializes" them.

    I posted a pretty good/simple semaphore file module here (drawn from a Perl Journal article by Sean Burke).

    I actually had a recent case using Oracle on a Solaris server, with a remote Windows 2000 box running up to 12 simultaneous instances of a given Perl script, each instance doing a pretty heavy query-for-update. When I left it to Oracle to manage the contention, the delays in servicing the queries seemed multiplicative: the time to complete a given query varied with the number of pending requests, and as a result some requests were pending for a very long time. When I included the use of a semaphore file in the script, so that the queries were serialized on the Windows box, the delays became simply additive: the actual query execution always took roughly the same time.
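    The semaphore-file idea can be sketched with core Perl's flock alone. The file path here is an example, and the DBI work goes in the critical section:

```perl
# Sketch: serialize concurrent script instances on an exclusive lock.
use strict;
use warnings;
use Fcntl qw(:flock);

my $semfile = '/tmp/tracker.sem';    # example path; pick your own
open my $sem, '>', $semfile or die "Cannot open $semfile: $!";

# Blocks until any other instance holding the lock releases it.
flock($sem, LOCK_EX) or die "Cannot lock $semfile: $!";

# --- critical section: do the database insert here ---

flock($sem, LOCK_UN);
close $sem;
```

    Because LOCK_EX blocks, each instance simply waits its turn instead of failing, which is what makes the delays additive rather than multiplicative.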

Re: Issue with simultaneous MySQL actions
by Anonymous Monk on Aug 01, 2003 at 19:34 UTC
    I've changed the code but now I discovered the following in the error_log for Apache:
    DBD::mysql::st execute failed: You have an error in your SQL syntax near 'agostino@mycomcast.com,karen@adkinsrealty.com,nmanning@realtor.com,team1@atlanti' at line 1

    I should try $dbh->quote() then, right? Because now it seems like another issue with the actual data I'm sending to MySQL.

    Thanks,
    Ralph

Node Type: perlquestion [id://279756]
Approved by Cine