Well, the script now ran through and again put only the correct records in my "clean" file, but the error file now has 1138 lines where I should only have 272. When I did a "head" on that error file, I found it lists each field as a separate line, with no comma delimiters, i.e.:
650187016
2
1
checked out under cash
650200678
1
1
HIT CASH TWICE
650096506
1
whereas I want the lines put back in their original form, all commas included, i.e.
650187016,2,1,checked out under cash ,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Nope! STILL didn't work! Now it jumbled all the fields from the "error" records together into just one line, with spaces and no commas. My errorFiles.csv has just one line containing every field from the error records.
if (!length $fields[28]) {
print $ERR_FH join (',', $_) for @fields;
}
else
...is what I'm using. So what else could be missing?
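For anyone following along, the difference between the per-field loop and a single join over the whole array can be shown in isolation. This is a standalone sketch; the sample field values are taken from the output shown above:

```perl
use strict;
use warnings;

my @fields = ('650187016', '2', '1', 'HIT CASH TWICE');

# The statement-modifier loop calls join() on one scalar at a time,
# so there is nothing to separate: each field comes out bare, with
# no commas and no record terminator.
my $looped = '';
$looped .= join(',', $_) for @fields;
# $looped is now "65018701621HIT CASH TWICE"

# Joining the whole array in one call puts the commas between fields.
my $joined = join(',', @fields);
# $joined is "650187016,2,1,HIT CASH TWICE"
```

Note that neither form adds a newline, so `print` still has to supply the record terminator.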
print $ERR_FH join(',', @fields);
So close - all the commas are now back in, but it's still putting all the fields into one big long single record.
if (!length $fields[28]) {
chomp @fields;
print $ERR_FH join (',', $_) for @fields;
}
No - that did not do it either. It's as if it's not joining the fields back together with commas to print each line: it's still writing every non-empty field into errorFiles.csv as one continuous line, with a space between each field.
Wait - never mind! I think I caught what else I left in that statement that caused the problem. Am running it again now to verify.
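For later readers: the symptom here (commas restored, but everything on one line) points at a missing record terminator on the `print`. A minimal sketch of the error-file branch, assuming that was the culprit; the sample data and the `$fields[28]` test come from the thread, while the sub name `format_error_record` is made up for illustration:

```perl
use strict;
use warnings;

# Join the whole field list once per record and terminate it with
# "\n", so each rejected record lands on its own line in the error
# file. (The per-field "for" loop printed one field at a time, with
# no separator and no newline, which is what flattened everything.)
sub format_error_record {
    my @fields = @_;
    chomp @fields;                  # strip any newline clinging to the last field
    return join(',', @fields) . "\n";
}

# Example: a record whose 29th field (index 28) is empty is an error record.
my @fields = ('650187016', '2', '1', 'checked out under cash', ('') x 26);
if (!length $fields[28]) {
    print format_error_record(@fields);
}
```

In a real run the `print` would go to the error filehandle (`print $ERR_FH format_error_record(@fields);`), matching the code shown earlier in the thread.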