(not tested, but it does compile)

    use DBI;

    my @insert_cols = qw/BS queue_name in_q_per in_q_num out_q_per out_q_num/;
    my @inner_keys  = @insert_cols[2..5];

    my $dbh = DBI->connect( 'whatever...', 'name', 'pswd' );
    my $insert_sth = $dbh->prepare(
        'insert into my_table (' . join( ',', @insert_cols )
        . ') values (' . join( ',', ('?') x @insert_cols ) . ')'
    );

    for my $BS ( keys %HoH ) {
        for my $queue_name ( keys %{ $HoH{$BS} } ) {
            my @nums = @{ $HoH{$BS}{$queue_name} }{ @inner_keys };
            $insert_sth->execute( $BS, $queue_name, @nums );
        }
    }
However, if there's a lot of data to be inserted, you might want to consider just printing the rows to a tab-delimited text file and using whatever tool your database server provides for doing bulk inserts from such a file. DBI tends to be very slow with inserts compared to a compiled utility that is native to the DB server -- we're talking about differences of 10-to-1 or more in wall-clock time. (If it's not a lot of data, it doesn't matter, but once it gets into thousands of rows, you'll really notice the difference.)
Note that your perl script could both write the text file and then use system() to run the native bulk-loader tool on that file, keeping the whole process integrated in one script. It's a matter of replacing the $sth->execute(...) with print TSV join( "\t", ... ), "\n" and adding the necessary open TSV, ..., close TSV, and system calls around the for loop shown above.
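A minimal sketch of that replacement, using a made-up two-row %HoH and a hypothetical output file name (my_table.tsv); the commented-out system() call shows one way a MySQL bulk load might look, but the exact loader command depends on your server:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical sample data in the same HoH shape as the original code.
    my %HoH = (
        bs1 => { qA => { in_q_per => 1, in_q_num => 2,
                         out_q_per => 3, out_q_num => 4 } },
    );
    my @inner_keys = qw/in_q_per in_q_num out_q_per out_q_num/;

    # Write one tab-delimited row per (BS, queue_name) pair.
    open my $tsv, '>', 'my_table.tsv' or die "open: $!";
    for my $BS ( sort keys %HoH ) {
        for my $queue_name ( sort keys %{ $HoH{$BS} } ) {
            my @nums = @{ $HoH{$BS}{$queue_name} }{ @inner_keys };
            print $tsv join( "\t", $BS, $queue_name, @nums ), "\n";
        }
    }
    close $tsv or die "close: $!";

    # Then hand the file to the server's native loader, e.g. for MySQL:
    # system( 'mysql', 'mydb', '-e',
    #     "LOAD DATA LOCAL INFILE 'my_table.tsv' INTO TABLE my_table" ) == 0
    #     or die "bulk load failed: $?";

The sort calls just make the row order deterministic; they aren't needed for correctness.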
In reply to Re: Hash of Hash to mysql
by graff
in thread Hash of Hash to mysql
by NothingInCommon