This works, but is very slow. I can't get around the repeated prepares because the table changes with each iteration, and there is way too much data (~1GB) to hold it all in table-keyed hashes ready for a table-wise insertion at the end.

Pseudo code:

    open FILE
    while ($line = <FILE>) {
        analyze $line to extract ($table, $value) pairs
        foreach ($table, $value) pair {
            $sql = "INSERT INTO $table VALUES($value)";
            prepare
            execute()
        }
    }
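One thing that might help even with this line-by-line approach is switching to a placeholder plus DBI's prepare_cached(), so each distinct table only gets prepared once for the whole run instead of once per row. A rough sketch, where the connection details, the file name, and analyze_line() are stand-ins for whatever you already have:

    use strict;
    use warnings;
    use DBI;

    # Placeholder connection details and file name -- substitute your own.
    my $dbh  = DBI->connect('dbi:mysql:mydb', 'user', 'pass', { RaiseError => 1 });
    my $file = 'data.txt';

    open my $fh, '<', $file or die "Can't open $file: $!";
    while (my $line = <$fh>) {
        # analyze_line() stands in for your existing parsing code;
        # assume it returns a list of [$table, $value] pairs.
        for my $pair (analyze_line($line)) {
            my ($table, $value) = @$pair;

            # prepare_cached() hands back the same statement handle every
            # time it sees the same SQL, so each table is prepared only once.
            my $sth = $dbh->prepare_cached("INSERT INTO $table VALUES(?)");
            $sth->execute($value);
        }
    }
    close $fh;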
One option I'm considering is doing it in blocks, storing for a while, then purging:
    open FILE
    $index = 0;
    while ($line = <FILE>) {
        analyze $line to extract ($table, $value) pairs
        push each found $value onto the list in $values_hash{$table},
            so $values_hash{$table} = a list of $values
        $index++;
        when $index reaches, say, 10,000 or 100,000 {
            foreach $key (keys %values_hash) {
                $sql = "INSERT INTO $key VALUES(?)"
                prepare
                foreach $value in list $values_hash{$key}
                    execute($value)
            }
            clear %values_hash ready to start storing again
        }
    }

I figure this should help, but I'm not sure how much I will actually gain by that.
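For what it's worth, the bigger win in a batched version is usually the transaction handling rather than the prepare savings: with AutoCommit on, every execute is its own commit. A sketch of the batching idea in real DBI, committing once per batch (connection details and analyze_line() are again placeholders):

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:mydb', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 0 });  # placeholders

    my $BATCH = 10_000;
    my %values_for;     # table name => array ref of values
    my $count = 0;

    sub flush_batch {
        for my $table (keys %values_for) {
            my $sth = $dbh->prepare_cached("INSERT INTO $table VALUES(?)");
            $sth->execute($_) for @{ $values_for{$table} };
        }
        $dbh->commit;   # one commit per batch instead of one per row
        %values_for = ();
        $count = 0;
    }

    open my $fh, '<', 'data.txt' or die $!;
    while (my $line = <$fh>) {
        for my $pair (analyze_line($line)) {   # your own parser
            my ($table, $value) = @$pair;
            push @{ $values_for{$table} }, $value;
            flush_batch() if ++$count >= $BATCH;
        }
    }
    flush_batch() if $count;   # don't forget the final partial batch
    close $fh;
    $dbh->disconnect;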
Are there any other slick speed tricks? I vaguely remember something to do with pre-generating CSV files and then importing....
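The CSV trick you're half-remembering is probably the bulk-load path: write one flat file per table, then let the database import each file in a single statement (LOAD DATA INFILE on MySQL, COPY on PostgreSQL). That is generally much faster than any INSERT loop. A sketch assuming MySQL, one-column tables, and values with no embedded delimiters (file paths, connection details and analyze_line() are made up here):

    use strict;
    use warnings;
    use DBI;

    # May need mysql_local_infile=1 in the DSN for LOCAL INFILE to work.
    my $dbh = DBI->connect('dbi:mysql:mydb', 'user', 'pass', { RaiseError => 1 });

    # First pass: write one flat file per table.
    my %fh_for;
    open my $in, '<', 'data.txt' or die $!;
    while (my $line = <$in>) {
        for my $pair (analyze_line($line)) {   # your own parser
            my ($table, $value) = @$pair;
            unless ($fh_for{$table}) {
                open $fh_for{$table}, '>', "/tmp/$table.csv" or die $!;
            }
            # If values can contain delimiters or newlines, use Text::CSV
            # here instead of a bare print.
            print { $fh_for{$table} } "$value\n";
        }
    }
    close $_ for values %fh_for;
    close $in;

    # Second pass: one bulk load per table.
    for my $table (keys %fh_for) {
        $dbh->do("LOAD DATA LOCAL INFILE '/tmp/$table.csv' INTO TABLE $table");
    }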