One flaw in your code is that you try to close $infile, where $infile is the file name, not the file handle.
And I wonder how you can think that
    while (my $line = <CSVFILE>) {
        $line =~ tr/"\r\n//d;
is ever going to do what you think it is doing. The readline operator <CSVFILE> reads lines based on $/, which is most likely a newline, so if the line just read contains a newline at all, it will be the last character of $line in this loop. To strip that trailing newline, tr/// is not what you want; chomp is.
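To make the difference concrete, here is a small made-up example (the sample string is invented, not taken from your data):

    my $line = qq{"foo","bar"\r\n};   # what one read of <CSVFILE> might return
    chomp $line;                      # removes only the trailing "\n" (the current $/)
    $line =~ tr/"\r\n//d;             # removes every ", \r, and \n anywhere in the string
    print "$line\n";                  # prints: foo,bar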
If, however, your goal is to delete \r and \n from the CSV fields after the line has been read, this will fail horribly. You should use a CSV parsing module instead, turning the first loop into something like:
    use Text::CSV_XS;

    my $csv = Text::CSV_XS->new ({ binary => 1, auto_diag => 1 });
    open my $fh, "<", $infile or die "$infile: $!\n";
    while (my $row = $csv->getline ($fh)) {
        $row->[1] =~ tr/"\r\n"//d;
        my (undef, @prefinal) = split m/\\/ => $row->[1];
        push @final, [ @prefinal, $row->[0], $row->[3] ];
        }
    close $fh;
Text::CSV will work the same, but Text::CSV_XS is much faster.
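For completeness: if Text::CSV_XS is not installed, only the use line and the constructor need to change; as far as I know, Text::CSV picks the XS backend automatically when it is available and falls back to the pure-Perl parser otherwise:

    use Text::CSV;   # uses Text::CSV_XS if present, pure-Perl parser otherwise

    my $csv = Text::CSV->new ({ binary => 1, auto_diag => 1 });
    # the rest of the loop stays exactly the same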