elmic11111 has asked for the wisdom of the Perl Monks concerning the following question:
I have a script that reads a file and puts the data into my log database. The log file data is set up as KEY=>VALUE pairs; as I read each line I split it and put the pairs into a hash called %sqlinsert.
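A minimal sketch of that parsing step, assuming whitespace-separated KEY=>VALUE pairs (the actual line format and delimiter aren't shown in the post, so both are hypothetical here):

```perl
use strict;
use warnings;

# Hypothetical log line; the real format may differ.
my $line = "host=>web01 status=>200 bytes=>1532";

my %sqlinsert;
for my $pair (split /\s+/, $line) {
    # Split on the first '=>' only, in case a value contains '=>'.
    my ($key, $value) = split /=>/, $pair, 2;
    $sqlinsert{$key} = $value;
}
# %sqlinsert now holds ( host => 'web01', status => 200, bytes => 1532 )
```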
I prepare the insert query in advance using this code:
```perl
my $fields    = join(', ', @dbfields);
my $places    = join(', ', ('?') x @dbfields);
my $insertrow = "INSERT INTO $table_webreporting ($fields) VALUES ($places)";
$insertrow    = $dbhc->prepare($insertrow);
```
I then insert it with this code:
```perl
$insertrow->execute(@sqlinsert{@dbfields});
```

This worked fine until recently, when my log files changed. In the new format, each line doesn't always have all the fields; it only has the ones needed for that transaction. This presents a problem, since I now have missing values. I tried a foreach routine like this, hoping that setting undef for the missing values would work, but Perl complains during the execute about values not being defined:
```perl
foreach my $temp (@dbfields) {
    $sqlinsert{$temp} = undef unless ($sqlinsert{$temp});
}
```
What I'd like it to do is put a NULL value in for any field that doesn't have a value.
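For what it's worth, a hash slice already returns undef for any key that is absent from the hash, and DBI binds an undef value as SQL NULL, so no pre-filling loop should be needed. A small sketch of the slice behavior (the database call is shown only as a comment, since it needs a live handle):

```perl
use strict;
use warnings;

my @dbfields  = qw(host status bytes referer);
my %sqlinsert = ( host => 'web01', status => 200 );  # 'bytes', 'referer' missing

# A hash slice yields undef for keys that don't exist in the hash.
my @values = @sqlinsert{@dbfields};

# DBI turns undef bind values into SQL NULL, so the original call
# would insert NULL for the missing columns:
# $insertrow->execute(@sqlinsert{@dbfields});
```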
Replies are listed 'Best First'.
Re: DBI and Hash
by runrig (Abbot) on Feb 13, 2012 at 21:36 UTC
Re: DBI and Hash
by ww (Archbishop) on Feb 13, 2012 at 21:42 UTC
by elmic11111 (Novice) on Feb 13, 2012 at 23:28 UTC
by Anonymous Monk on Feb 14, 2012 at 08:57 UTC