in reply to Re^2: Multiple queries on DBI corresponding to multiple csv files?
in thread Multiple queries on DBI corresponding to multiple csv files?
Normally you don't have to fiddle with "AutoCommit": the default is "true", at least with SQLite. Basically, you want write operations to commit automatically unless you explicitly override that behavior.
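For reference, here is a minimal sketch of where AutoCommit usually gets set, namely at connect time (the example.db file name is just for illustration):

    use strict;
    use warnings;
    use DBI;

    # AutoCommit => 1 is the DBI default: every write commits on its own.
    # Setting it to 0 would make you responsible for calling $dbh->commit.
    my $dbh = DBI->connect("dbi:SQLite:dbname=example.db", "", "",
                           { RaiseError => 1, AutoCommit => 1 })
        or die $DBI::errstr;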
A commit operation is very expensive: the DB has to do a lot of work to make sure the data is "really there". Every DB has a practical limit on the number of transactions per second (basically, commits per second), while the number of plain operations per second it can handle is more than an order of magnitude higher. One reason to manually control the start and end of a transaction is when you are doing a lot of inserts at once. For example:
    my $addRoster = $dbh->prepare("INSERT INTO Roster (Call,Owner) VALUES (?,?)");
    $dbh->do("BEGIN");   # start a new transaction for all the inserts
    foreach my $row (@$info)   # the 'roster' sheet of Roster.xls, 20 Feb 2018
    {
        ...blah...blah...
        $addRoster->execute($callsign, $owner);
    }
    $dbh->do("COMMIT");  # massive speed-up by making it one transaction!

This code slams data from an Excel spreadsheet into a DB, and the sequence is typical: the SQL statement is prepared once, outside the loop, and then executed many times with different data values. Note that placeholders can only stand in for data values, not identifiers, so this can't be done with the table name as a variable; the table has to be fixed at prepare time. This code has been in production for a year and takes under 1 second to run. If I take out the BEGIN and COMMIT statements, it takes about 10 seconds, a 10x difference, because without them each insert is committed automatically. You can run 1 million inserts as a single transaction, and I have code that does exactly that. It can make the difference between, say, 20 seconds and 4-6 hours!
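For what it's worth, DBI also offers the portable begin_work/commit pair, which gets the same batching effect without issuing raw BEGIN/COMMIT statements and plays nicely with rollback on error. A rough sketch (the @rows array of [$call, $owner] pairs is assumed, not from the original code):

    my $sth = $dbh->prepare("INSERT INTO Roster (Call,Owner) VALUES (?,?)");
    $dbh->begin_work;                   # turns AutoCommit off until commit/rollback
    eval {
        $sth->execute(@$_) for @rows;   # all rows ride in one transaction
        $dbh->commit;
        1;
    } or do {
        my $err = $@ || 'unknown error';
        warn "Insert failed, rolling back: $err";
        $dbh->rollback;
    };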