I would strongly agree with others who have recommended using a standard, well-tested CSV-parsing module.
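For reference, the module route might look something like the sketch below. It is untested: it assumes Text::CSV is installed, the escape_char setting is a guess based on the backslash-escaping described further down, and (unlike the one-liner) Text::CSV hands the fields back with their surrounding double-quotes stripped.

# minimal sketch, assuming Text::CSV is installed and that the data really does
# use backslash-escaped quotes (both of these are assumptions)
use strict;
use warnings;
use Text::CSV;

my $csv = Text::CSV->new({
    binary           => 1,
    escape_char      => '\\',   # assumption: backslash escapes inside quoted strings
    allow_whitespace => 1,      # tolerate whitespace around the commas
}) or die Text::CSV->error_diag;

open my $in,  '<', 'NSRLFile.csv' or die "NSRLFile.csv: $!";
open my $out, '>', 'NSRLFile.psv' or die "NSRLFile.psv: $!";

while (my $row = $csv->getline($in)) {
    # fields come back unquoted; rejoin them with pipes
    print {$out} join('|', @$row), "\n";
}

close $in;
close $out;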
However, if you really have to go commando (and do it on the command line to boot), the following (tested) code should work.
Note that:
- NSRLFile.csv contains the example data from the OP
- NSRLFile.psv contains the processed, pipe-character-separated data
- the code makes no assumptions about the absence or presence of whitespace around the value-separating commas or at the beginning or end of records
- double-quoted strings can contain anything, even backslash-escaped double-quotes (or any other backslash-escaped character)
- the only value supported other than a double-quoted string is a simple numeric with NO sign, decimal point or embedded commas
- the code makes no assumptions about the order of the fields in a record
- I had to use \042 in place of a literal double-quote character; otherwise the XP command-line interpreter thinks the quoted program has ended and treats the first pipe character it sees (in a regex) as an actual command-line pipe operator
perl -wlpe "use strict; my $quo = qq(\042); my $esc = qq(\\\\);
my $body = qr/[^$quo$esc]/; my $escaped = qr/$esc ./x;
my $sep = qr/ \s* , \s* | \s* \Z /x;
my $quoted = qr( $quo $body* (?: $escaped $body* )* $quo )x;
my $number = qr(\d+);
$_ = join '|', m/( $quoted | $number ) (?= $sep )/gx"
NSRLFile.csv > NSRLFile.psv
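If the one-liner is hard to read, here is roughly the same tokenizing logic laid out as an ordinary script. This is a sketch of the same approach, not a tested drop-in replacement, and the script name nsrl2psv.pl is just for illustration.

#!/usr/bin/perl
# nsrl2psv.pl -- sketch of the one-liner above as a plain script
# run as: perl nsrl2psv.pl NSRLFile.csv > NSRLFile.psv
use strict;
use warnings;

my $quo     = qq(\042);                   # a double-quote character
my $esc     = qq(\\\\);                   # a regex-escaped backslash
my $body    = qr/[^$quo$esc]/;            # any char except quote or backslash
my $escaped = qr/$esc ./x;                # a backslash-escaped character
my $sep     = qr/ \s* , \s* | \s* \Z /x;  # comma separator, or end of record
my $quoted  = qr( $quo $body* (?: $escaped $body* )* $quo )x;
my $number  = qr(\d+);                    # bare unsigned integer

while (defined(my $line = <>)) {
    chomp $line;
    # grab every quoted string or bare number that is followed by a separator,
    # then glue the fields back together with pipes
    my @fields = $line =~ m/( $quoted | $number ) (?= $sep )/gx;
    print join('|', @fields), "\n";
}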