herda05 has asked for the wisdom of the Perl Monks concerning the following question:
I'm new to DBI. I read through the DBI documentation on CPAN (http://search.cpan.org/~timb/DBI/DBI.pm) and was particularly interested in using placeholders. In practice, however, I'm finding them very limited; I'm hoping that's just my inability to build the right data structures around them. I'm using SQLite 3.5.9 as the backend.
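By placeholders I mean the ? markers in the SQL text that DBI binds to values at execute time, so the driver handles all the quoting. A minimal example (table and column names made up for illustration):

    my $sth = $dbh->prepare('SELECT name FROM users WHERE id = ?');
    $sth->execute($user_id);   # DBI binds and quotes $user_id safely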
What I have is a file of object definitions, each made up of anywhere from 4 to 8 commands. There is no marker for the end of a definition, only for the beginning. I slurp the file into an array, then cycle through it building a second array holding a single object definition. I then use a case statement to identify the commands and their values. After updating or inserting into the tables, I grab the next object definition and the process starts again.
    use Switch;   # provides the switch/case construct used below

    foreach my $line (@jobDef) {
        chomp($line);
        $line =~ s/\s+$//g;
        switch ($line) {
            case (/DOCOMMAND/) {
                # This keyword is used for binary commands; it is
                # mutually exclusive with SCRIPTNAME
                $command = substr($line, 11);
                $stmnt{$command} = "UPDATE twsJobs SET command = ? WHERE jobId = ?";
            }
            case (/SCRIPTNAME/) {
                # This keyword is for scripts; it's mutually exclusive
                # with DOCOMMAND
                $command = substr($line, 11);
                $stmnt{$command} = "UPDATE twsJobs SET command = ? WHERE jobId = ?";
            }
            ...
As you can see, I'm creating a separate SQL statement for each command, building up a hash of value => SQL. At the end I cycle through the hash, feeding each key and value to a sub that takes $dbh, $value, $statement, and $rowId (produced by an earlier SELECT statement and stored in @joid) and runs DBI's prepare_cached and execute methods.
    foreach my $jobId (@joid) {
        #print __LINE__ . " UPDATE twsJobs SET $update WHERE jobId = $jobId\n";
        # Unpack our hash and send the key, value over to updateTableSingle
        while (my ($k, $v) = each %stmnt) {
            $dbh = updateTableSingle($dbh, $k, $v, $jobId);
        }
    }
    ...
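The sub itself boils down to something like this (simplified sketch; the real code has more to it):

    # Sketch of updateTableSingle as called above: prepare_cached the
    # statement, bind the value and the row id, then execute.
    sub updateTableSingle {
        my ($dbh, $value, $statement, $jobId) = @_;
        my $sth = $dbh->prepare_cached($statement);
        $sth->execute($value, $jobId);
        return $dbh;
    }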
This is sloooow compared to the hash-of-hashes (HOH) approach I'm trying to switch away from. A test file with a couple hundred defined objects takes about 7 minutes, whereas the HOH version took less than a minute. The real data will have tens of thousands of object definitions, and I'm afraid of how long that will take; the HOH version manages it in about 3 to 4 minutes. The answer seems to be in leveraging prepare_cached and placeholders, but placeholders don't seem to work unless I know the column name, the value, and the rowId.
Is there a way to leverage placeholders and prepare_cached when that many values change from one statement to the next? I tried interpolating plain variables, but single quotes and other stray characters break the SQL, and every recommendation is to use placeholders.
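The only pattern I can think of is to keep placeholders for the values (they can't stand in for column or table names) and interpolate a whitelisted column name into the SQL, so that prepare_cached can reuse one handle per column. An untested sketch (column names invented):

    # Untested sketch: placeholders only cover values, never identifiers,
    # so interpolate a whitelisted column name and keep placeholders for
    # the value and the row id. prepare_cached returns the same handle
    # for an identical SQL string, so each column's statement is only
    # prepared once across the whole run.
    my %valid_col = map { $_ => 1 } qw(command script priority);   # invented names

    sub update_column {
        my ($dbh, $col, $value, $jobId) = @_;
        die "unexpected column '$col'" unless $valid_col{$col};
        my $sth = $dbh->prepare_cached("UPDATE twsJobs SET $col = ? WHERE jobId = ?");
        $sth->execute($value, $jobId);
    }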
Replies are listed 'Best First'.
Re: Using prepare_cached with placeholders
by Anonymous Monk on Apr 03, 2010 at 03:52 UTC
  by herda05 (Acolyte) on Apr 03, 2010 at 22:31 UTC
  by ikegami (Patriarch) on Apr 04, 2010 at 00:05 UTC

Re: Using prepare_cached with placeholders
by CountZero (Bishop) on Apr 03, 2010 at 18:47 UTC
  by herda05 (Acolyte) on Apr 03, 2010 at 23:01 UTC
  by CountZero (Bishop) on Apr 04, 2010 at 06:58 UTC
  by Anonymous Monk on Apr 03, 2010 at 23:49 UTC