In the code base I referenced, it's actually reporting metrics for some traffic. So instead of needing to loop over the array later and do something like
# not actual code, but it shows the idea
my $f = $array[$i] || 0;
$sql .= qq{"$f", };
I can now simply do
$sql .= join('", "', @array) . '");';
And I know my values are sane.
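To make the idea above concrete, here's a small sketch. The array name and SQL fragment are hypothetical, not from my actual code; the point is just that once every slot is pre-initialized to 0, join() needs no per-element test:

```perl
use strict;
use warnings;

# Hypothetical counters; only some slots ever get incremented.
my @counts = (0) x 5;      # pre-initialize every slot to 0
$counts[1] = 7;
$counts[3] = 2;

# Every element is a defined number, so no "|| 0" test in a loop:
my $sql = 'INSERT INTO metrics VALUES ("'
        . join('", "', @counts)
        . '");';
# $sql is: INSERT INTO metrics VALUES ("0", "7", "0", "2", "0");
```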
I guess in the case of the OP this isn't a factor, but in my case I initialize to a number, because all my values are numbers, and I think it makes things slightly easier on perl at run time. If we simply created placeholders in the array, I'm not sure what data type perl would use behind the scenes, or rather how much space would be allocated to each element, nor how much work Perl would have to do behind the scenes to go from undef -> val -> bigger val. I am also dealing with what I consider monster data sets. I process about 22GB of data, then another 15GB or so, and correlate the info. If I can drop a few tests from loops, or make my run time a little quicker, then I do.
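Rather than guess at the cost, you can measure it with the core Benchmark module. This is only a rough sketch (it times just the join-with-test vs. plain-join difference, not allocation behavior, and the data here is made up; `//` needs perl 5.10+):

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Sparse array: only every third slot is defined (hypothetical data).
my @sparse;
$sparse[$_] = $_ for grep { $_ % 3 == 0 } 0 .. 999;

# Dense version: same data, pre-initialized so nothing is undef.
my @dense = map { $_ // 0 } @sparse;

cmpthese(-1, {
    test_each  => sub { my $s = join '", "', map { $_ // 0 } @sparse },
    plain_join => sub { my $s = join '", "', @dense },
});
```

Both produce identical strings; the comparison just shows what the per-element defined-or test costs on your hardware.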
Or I could just be lazy and not want to add another test inside a loop later, when I could deal with it up front and just go :)
Update: Heh, I think I misinterpreted the question asked, so let me add this.
I guess it honestly doesn't matter how you test. Looking at the requirements, where I assume the info will always be digit.string, using the value as a boolean is OK to determine what we're going to do. If the value could be zero, then I would have used defined like you suggested.
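The distinction is easy to see with a throwaway snippet: 0 is false in boolean context but is still defined, so the two tests disagree exactly when zero is legitimate data.

```perl
use strict;
use warnings;

for my $val (undef, 0, 5) {
    my $as_bool = $val         ? 'true'    : 'false';
    my $as_def  = defined $val ? 'defined' : 'not defined';
    printf "%-5s -> boolean: %-5s  defined: %s\n",
        defined $val ? $val : 'undef', $as_bool, $as_def;
}
# 0 comes out "false" but "defined", so use defined() when 0 is valid.
```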
MMMMM... Chocolaty Perl Goodness.....