I have a script that reads a log file and loads the data into my log database. The log file data is set up as KEY=>VALUE pairs; as I read each line I split it and put the results into a hash called %sqlinsert.
I prepare the insert query in advance using this code:
    my $fields    = join(', ', @dbfields);
    my $places    = join(', ', ('?') x @dbfields);
    my $insertrow = "INSERT INTO $table_webreporting ($fields) VALUES ($places)";
    $insertrow    = $dbhc->prepare($insertrow);
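A minimal, standalone sketch of that placeholder-building step (the table and field names here are made up for illustration; `('?') x @dbfields` uses list repetition, so the placeholder count always matches the field count):

```perl
use strict;
use warnings;

# Hypothetical field list, standing in for @dbfields.
my @dbfields = qw(timestamp user action);

my $fields = join(', ', @dbfields);            # "timestamp, user, action"
my $places = join(', ', ('?') x @dbfields);    # "?, ?, ?"
my $sql    = "INSERT INTO webreporting ($fields) VALUES ($places)";

print "$sql\n";
# prints: INSERT INTO webreporting (timestamp, user, action) VALUES (?, ?, ?)
```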
I then insert it with this code:
    $insertrow->execute(@sqlinsert{@dbfields});
This worked fine until recently, when my log files changed. In the new log format, each line doesn't always have all the fields; it only has the ones needed for that transaction. This presents a problem, since I now have missing values. I tried setting up a foreach routine like this, hoping that setting undef for the missing values would work, but Perl complains during the execute about things not being defined:
    foreach my $temp (@dbfields) {
        $sqlinsert{$temp} = undef unless exists $sqlinsert{$temp};
    }
(Note: testing `exists` rather than truthiness avoids clobbering legitimate values like 0 or the empty string.)
What I'd like it to do is put a NULL value in for any field that doesn't have a value.
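For what it's worth, a hash slice on keys that were filled with undef does yield undef in field order, and DBI binds an undef parameter as SQL NULL on execute(). A small self-contained sketch of just that slice behaviour, without the database (field names and the sample log line are made up):

```perl
use strict;
use warnings;

# Hypothetical field list and a partial log line missing two fields.
my @dbfields  = qw(timestamp user action amount);
my %sqlinsert = ( timestamp => '2024-01-01', action => 'login' );

# Fill in undef for any field this log line didn't supply; 'exists'
# (rather than a truthiness test) keeps values like 0 or '' intact.
for my $field (@dbfields) {
    $sqlinsert{$field} = undef unless exists $sqlinsert{$field};
}

# The hash slice yields values in @dbfields order, undef where missing;
# DBI would bind each undef as SQL NULL in execute().
my @values = @sqlinsert{@dbfields};
print join(', ', map { defined $_ ? $_ : 'NULL' } @values), "\n";
# prints: 2024-01-01, NULL, login, NULL
```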
In reply to DBI and Hash by elmic11111