in reply to Re^2: Efficient way to handle huge number of records?
in thread Efficient way to handle huge number of records?
In the past (the last time I tried was, I think, a couple of years ago) SQLite always proved prohibitively slow: loading multi-million-row data was ridiculously slow.
I said "handle" not "handle well" :)
That said, I had SQLite on my old machine and found that .import file table via sqlite3.exe was substantially faster than doing inserts via SQL, whether those inserts came from the command-line utility or from Perl & DBI.
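Much of .import's speed comes from doing the whole load in one go rather than paying per-statement transaction overhead. If you do have to insert programmatically, wrapping the batch in a single transaction recovers a lot of that speed. The thread is Perl/DBI, but here's a minimal sketch of the same idea using Python's stdlib sqlite3 bindings (table name and row data are made up for illustration):

```python
import sqlite3

# Sketch: bulk-loading rows programmatically. The big win over naive
# row-at-a-time INSERTs (each in its own implicit transaction) is doing
# the whole load inside ONE transaction, much as .import effectively does.
conn = sqlite3.connect(":memory:")  # use a file path for a real database
conn.execute("CREATE TABLE records (id INTEGER, name TEXT)")

rows = ((i, f"name-{i}") for i in range(100_000))  # stand-in for a data file
with conn:  # one transaction for the entire batch
    conn.executemany("INSERT INTO records VALUES (?, ?)", rows)

count, = conn.execute("SELECT COUNT(*) FROM records").fetchone()
print(count)  # 100000
```

In DBI the equivalent is $dbh->begin_work / $dbh->commit around a prepared-statement loop; the principle is the same.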
I wish I could get a 64-bit build for my system.