in reply to DBI::SQLite slowness

It is slow because you have AutoCommit set to 1, so SQLite commits a transaction for every single insert. Just change it to 0 and call $dbh->commit after the foreach loop.
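A minimal sketch of the fix (the table, column, and row counts here are illustrative, not from the original post; assumes DBD::SQLite is installed):

```perl
use strict;
use warnings;
use DBI;

# AutoCommit => 0 opens one transaction for the whole loop,
# instead of committing after every insert.
my $dbh = DBI->connect( "dbi:SQLite:dbname=:memory:", "", "",
    { RaiseError => 1, AutoCommit => 0 } );

$dbh->do("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)");

my $sth = $dbh->prepare("INSERT INTO items (name) VALUES (?)");
$sth->execute("item_$_") for 1 .. 10_000;

$dbh->commit;    # single commit after the loop

my ($count) = $dbh->selectrow_array("SELECT COUNT(*) FROM items");
print "$count\n";
$dbh->disconnect;
```

With AutoCommit => 1 each insert forces a synchronous write; batching them into one transaction is usually where the orders-of-magnitude speedup comes from.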

Replies are listed 'Best First'.
Re^2: DBI::SQLite slowness
by Endless (Beadle) on Sep 20, 2013 at 12:23 UTC
    Brilliant! With that little fix, my speed is up to 2022 per second; that's almost workable, and I understand what was happening. Now time to start looking through the other suggestions.
      This is what I get on an Atom eee PC (1.6 GHz), after I removed
      use v5.16.0;
      and changed
      say "Total time: ", (time - $start); # 180 seconds
      to
      print "Total time: ", (time - $start); # 180 seconds
      $ time perl db.pl
      Total time: 5

      real    0m5.348s
      user    0m0.360s
      sys     0m0.820s
        I've never heard reports that 5.16.0 will significantly slow a program, or that say is so much slower. What's going on here?

      Well, 200 million records at a rate of 2,000 per second is still 100,000 seconds, or almost 28 hours. That's still pretty long, isn't it? Having said that, you may be able to live with it; a full day of processing is still manageable in a number of cases. Beware, though, that the rate might slow down as your database grows larger.
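      The back-of-the-envelope arithmetic above can be checked directly:

```perl
use strict;
use warnings;

my $records = 200_000_000;    # 200 million records
my $rate    = 2_000;          # inserts per second reported above
my $seconds = $records / $rate;
my $hours   = $seconds / 3600;
printf "%d seconds, about %.1f hours\n", $seconds, $hours;
# 100000 seconds, about 27.8 hours
```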

      If you are really only looking to filter out duplicates, the ideas discussed by BrowserUk are probably much better than using a database.
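      For pure duplicate filtering, the classic in-memory hash idiom (one flavor of that kind of suggestion; the sample data here is illustrative) avoids the database entirely, provided the keys fit in RAM:

```perl
use strict;
use warnings;

# %seen records each key the first time it appears;
# the grep keeps an element only when its count was zero.
my %seen;
my @input  = qw(alpha beta alpha gamma beta delta);
my @unique = grep { !$seen{$_}++ } @input;
print "@unique\n";    # alpha beta gamma delta
```

This preserves the original order of first occurrence, which a DISTINCT query does not guarantee.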