in reply to DBI: speed up insertions in generating database

Thanks all for the suggestions - the common thread: do all the inserts for one table at a time.

So, I've rewritten the parsing code to store each table's values temporarily in a separate text file. Then I'll run a separate import routine that loads the database table by table.

The new text-file-storage parser is running right now at about 2,400 origin-file lines per hour (there are hundreds of values to extract from each line after comparison with criteria in another file - very convoluted), whereas the direct db-storage program was running at ~300 origin-file lines per hour.
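A minimal Perl sketch of the two-phase approach described above - phase 1 writes each table's rows to its own delimited text file instead of INSERTing directly. The table names, the tab-delimited format, and the placeholder parsing are all hypothetical; the real extraction logic against the criteria file would go where the comment indicates.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Phase 1: one output file per table (table names are hypothetical).
my %fh;
for my $table (qw(genes proteins annotations)) {
    open $fh{$table}, '>', "$table.tsv"
        or die "Can't open $table.tsv: $!";
}

while (my $line = <STDIN>) {
    chomp $line;
    # ... the convoluted per-line extraction against the criteria
    #     file goes here; this split is only a placeholder ...
    my @gene_row = split /\t/, $line;
    print { $fh{genes} } join("\t", @gene_row), "\n";
}

close $_ for values %fh;

# Phase 2 (run separately, per table), e.g. with mysqlimport:
#   mysqlimport --local dbname genes.tsv
```

Keeping the filehandles open for the whole run avoids reopening files per line; the import phase then touches each table exactly once.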


Replies are listed 'Best First'.
Re^2: DBI: speed up insertions in generating database
by hakkr (Chaplain) on Dec 04, 2004 at 16:45 UTC

    Yep, defo write out a CSV file and then use mysqlimport or a 'LOAD DATA INFILE' statement for each table. If you do stick with DBI inserts, maybe try inserts with multiple VALUES lists:

    INSERT INTO $table (field1,field2) VALUES (value1,value2),(value3,value4);
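    One way to build such a multi-row INSERT from DBI, sketched with placeholders so the values are quoted safely. The table, field names, and sample rows are hypothetical, and `$dbh` is assumed to be an existing connected handle:

    ```perl
    # Build "(?,?),(?,?),..." to match the number of rows,
    # then bind all values in one execute() call.
    my @rows = ( [ 1, 'a' ], [ 2, 'b' ], [ 3, 'c' ] );   # sample data
    my $placeholders = join ',', ('(?,?)') x @rows;
    my $sth = $dbh->prepare(
        "INSERT INTO mytable (field1, field2) VALUES $placeholders"
    );
    $sth->execute( map { @$_ } @rows );
    ```

    Batching rows this way amortizes the per-statement overhead that makes row-at-a-time inserts slow.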
    From the MySQL docs:

    When loading a table from a text file, use LOAD DATA INFILE. This is usually 20 times faster than using a lot of INSERT statements. See section 13.1.5 LOAD DATA INFILE Syntax.
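    The LOAD DATA statement can also be issued from Perl through DBI's do() method. A hedged sketch - the database name, credentials, table, and file name are hypothetical, and LOCAL requires the local_infile option to be enabled on both client and server:

    ```perl
    use DBI;

    # Connection details are placeholders.
    my $dbh = DBI->connect( 'dbi:mysql:dbname', 'user', 'pass',
                            { RaiseError => 1 } );

    # Field/line terminators must match how the file was written.
    $dbh->do(q{
        LOAD DATA LOCAL INFILE 'genes.tsv'
        INTO TABLE genes
        FIELDS TERMINATED BY '\t'
        LINES TERMINATED BY '\n'
    });

    $dbh->disconnect;
    ```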

    mysqlmonks.org