js1 has asked for the wisdom of the Perl Monks concerning the following question:
Perl Monks,
I have a routine that updates one column across a number of rows in a database table. I want to speed it up, because at the moment it runs a separate SQL UPDATE for each row: approx. 10,000 rows take 2 seconds.
    sub process_urlhits_update {
        my $sql1 = qq{UPDATE url_table SET hits};
        my $sql2 = qq{WHERE url};
        my $sql;
        foreach my $url ( keys %urlhits_update ) {
            # build and run a separate UPDATE for every row
            $sql = qq{ $sql1=$urlhits_update{$url} $sql2="$url" };
            my $sth = $dbh->prepare($sql);
            $sth->execute();
            $urlid_hash{$url} = get_urlid($url)
                if !defined $urlid_hash{$url};
            delete $urlhits_update{$url};
        }
    }
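One variant worth trying first: prepare the statement once with placeholders and reuse it, wrapping the loop in a single transaction instead of committing every row. An untested sketch (it assumes the same $dbh and %urlhits_update as above, and a transactional table type such as InnoDB):

    sub process_urlhits_update {
        # prepare once, outside the loop; placeholders also handle quoting
        my $sth = $dbh->prepare(
            q{UPDATE url_table SET hits = ? WHERE url = ?}
        );
        $dbh->begin_work;    # one commit for the whole batch
        foreach my $url ( keys %urlhits_update ) {
            $sth->execute( $urlhits_update{$url}, $url );
            $urlid_hash{$url} = get_urlid($url)
                if !defined $urlid_hash{$url};
            delete $urlhits_update{$url};
        }
        $dbh->commit;
    }

This avoids reparsing the SQL 10,000 times, and the placeholders sidestep quoting problems with odd characters in the URLs.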
I tried changing the code to write the statements to a file and then run a single LOAD DATA INFILE command. This runs quicker, at about 1.3 seconds for 10,000 rows:
    sub process_urlhits_update {
        my $sql1 = qq{\nUPDATE url_table SET hits};
        my $sql2 = qq{WHERE url};
        my $sql;
        open( FH, ">/tmp/updates.txt" )
            or die("could not open updates file");
        foreach my $url ( keys %urlhits_update ) {
            # accumulate the UPDATE statements in one string
            $sql .= qq{ $sql1=$urlhits_update{$url} $sql2="$url" };
            $urlid_hash{$url} = get_urlid($url)
                if !defined $urlid_hash{$url};
            delete $urlhits_update{$url};
        }
        print FH $sql;
        close FH;    # flush the file before the server reads it
        $sql = "LOAD DATA LOCAL INFILE '/tmp/updates.txt' INTO TABLE url_table";
        my $sth = $dbh->prepare($sql);
        $sth->execute();
    }
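One thing to note: LOAD DATA INFILE expects rows of data, not SQL statements, so the file above is read as column values rather than executed as UPDATEs. A version that writes tab-separated values and uses the REPLACE keyword would look something like this. Untested sketch: it assumes url_table has a unique key on url, and since REPLACE deletes and re-inserts each matching row, any columns not named in the load revert to their defaults.

    sub process_urlhits_update {
        open( FH, ">/tmp/updates.txt" )
            or die("could not open updates file");
        foreach my $url ( keys %urlhits_update ) {
            # default LOAD DATA format: tab-separated, newline-terminated
            print FH "$urlhits_update{$url}\t$url\n";
            $urlid_hash{$url} = get_urlid($url)
                if !defined $urlid_hash{$url};
            delete $urlhits_update{$url};
        }
        close FH;
        my $sth = $dbh->prepare(
            q{LOAD DATA LOCAL INFILE '/tmp/updates.txt'
              REPLACE INTO TABLE url_table (hits, url)}
        );
        $sth->execute();
    }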
However, my two routines that do bulk SQL inserts take only 0.26 seconds.
Is there anything I can do to improve the code above? For example, can I run LOAD DATA INFILE without actually writing to disk, or is there a way to do bulk updates?
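On the bulk-update question: if the server is MySQL 4.1 or later, a multi-row INSERT ... ON DUPLICATE KEY UPDATE can act as a batch of updates with no temp file at all, and unlike REPLACE it leaves the other columns of existing rows untouched. An untested sketch, again assuming a unique key on url:

    sub process_urlhits_update {
        my @urls = keys %urlhits_update;
        return unless @urls;
        # one (?,?) pair of placeholders per row to update
        my $placeholders = join ',', ('(?,?)') x @urls;
        my $sth = $dbh->prepare(
            qq{INSERT INTO url_table (url, hits) VALUES $placeholders
               ON DUPLICATE KEY UPDATE hits = VALUES(hits)}
        );
        # flatten to url1, hits1, url2, hits2, ...
        $sth->execute( map { $_, $urlhits_update{$_} } @urls );
        foreach my $url (@urls) {
            $urlid_hash{$url} = get_urlid($url)
                if !defined $urlid_hash{$url};
            delete $urlhits_update{$url};
        }
    }

For very large batches the single statement may need chunking to stay under the server's max_allowed_packet.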
Thanks as always for your wisdom.
js.
Replies are listed 'Best First'.

Re: speedy update routine
by Zaxo (Archbishop) on Jun 12, 2004 at 21:25 UTC
    by js1 (Monk) on Jun 12, 2004 at 21:48 UTC
    by Zaxo (Archbishop) on Jun 12, 2004 at 22:17 UTC

Re: speedy update routine
by exussum0 (Vicar) on Jun 12, 2004 at 23:48 UTC

Re: speedy update routine
by saberworks (Curate) on Jun 13, 2004 at 20:57 UTC