in reply to RE: Re: Simple Text Conversion
in thread Simple Text Conversion

Yes, but then, mine worked. :) Just kidding.

You're just pushing each line onto @recs as it comes--you need to group them in sets of 5, because that's what the original file looked like. Plus, why are you using foreach in the original reading-in-the-file loop? That

foreach my $line (<FH>)
reads the entire file into a temporary list first--not a horrible thing in many cases, but still, if we can process on a line-by-line basis, we may as well do so. :)
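To make the difference concrete, here's a minimal, untested sketch of the two reading styles (FH just stands in for whichever handle you actually open):

# foreach builds the whole list of lines in memory before the loop starts
foreach my $line (<FH>) {
    # ... work with $line ...
}

# while pulls one line per iteration, so memory use stays flat
while (my $line = <FH>) {
    # ... work with $line ...
}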

I like your use of split quite a lot, though.

So, combining your ideas and mine:

use strict;

open FH, "foo" or die "Can't open: $!";
my @recs;
while (<FH>) {
    chomp;
    push @{$recs[int(($.-1) / 5)]}, split /,?\s/;
}
close FH or die "Can't close: $!";

open FH, ">foo.new" or die "Can't open: $!";
print FH map join('|', @$_) . "\n", @recs;
close FH;
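To see what that produces, here's a hypothetical run. The file name foo comes from the script above; the ten sample lines below are invented purely for illustration, one field per line:

alpha
beta
gamma
delta
epsilon
zeta
eta
theta
iota
kappa

Every group of five input lines lands in one record, so foo.new comes out as:

alpha|beta|gamma|delta|epsilon
zeta|eta|theta|iota|kappa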
Notice I took the array slice out--I think the OP wanted everything in the array. If not, though, he/she should stick it back in, just
@$ref[0..4]
instead of
@$ref
And I'm now using map, just because map is great.
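If the OP does turn out to want only the first five fields of each record, one way to fold that slice back in--just an untested sketch--is right in the map expression:

print FH map join('|', @{$_}[0..4]) . "\n", @recs;   # same @$ref[0..4] slice, applied per record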

Why I Do What I Do
by chromatic (Archbishop) on Apr 06, 2000 at 22:57 UTC
    I'm happy with the resulting script. It's far more clear and far less ugly. Regarding some of the bogosities in my postulate:
    • Obviously, the line-by-line processing depends on whether the data file has each group of records on one line, or each record on a separate line. I went for the first assumption (as it makes more real-world sense to me) and you went for the second (as you had the patience to View Source).
    • That also lets me get away from the push reference position bit. Not a big loss.
    • I thought avoiding the implicit local $_ behavior in the while loop would be clearer for the OP (that style is sketched below this reply).
    • People don't use split and join nearly enough.
    Well done.
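For reference, here is a minimal, untested sketch of the explicit-variable style chromatic describes, applied to the same FH handle and 5-line grouping as the script above:

while (my $line = <FH>) {
    chomp $line;
    # same grouping as the implicit-$_ version, with the line in a named variable
    push @{$recs[int(($.-1) / 5)]}, split /,?\s/, $line;
}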