http://qs1969.pair.com?node_id=481173


in reply to Re: Splitting multiline scalars into different array entries
in thread Splitting multiline scalars into different array entries

Sorry, I should have provided the wanted output. That's what I ended up with on my first try :-) Wanted output:
$VAR1 = [
          [ 'single', 'cell', 'values' ],
          [ 'are', 'really', 'easy' ],
          [ 'but', 'these', 'aren\'t' ],
          [ 'multiline', 'cells', 'suck' ],
          [ 'stilton', 'is', 'great' ],
          [ 'back', 'to', 'life' ],
          [ 'back', 'to', 'reality', 'with', 'more', 'cells' ]
        ];
Cheers anyway!

davis
Kids, you tried your hardest, and you failed miserably. The lesson is: Never try.

Replies are listed 'Best First'.
Re^3: Splitting multiline scalars into different array entries
by polettix (Vicar) on Aug 05, 2005 at 11:12 UTC
    So you need to transpose - this thread should help.
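    For reference, here is a minimal transpose of an array-of-arrays — a sketch that assumes rectangular input (every row the same length); the linked thread covers more variants:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Transpose a rectangular array-of-arrays: element [$i][$j]
    # of the input becomes element [$j][$i] of the output.
    sub transpose {
        my @in = @_;
        my @out;
        for my $i ( 0 .. $#in ) {
            for my $j ( 0 .. $#{ $in[$i] } ) {
                $out[$j][$i] = $in[$i][$j];
            }
        }
        return @out;
    }

    my @t = transpose( [ 1, 2, 3 ], [ 4, 5, 6 ] );
    # @t is now ( [ 1, 4 ], [ 2, 5 ], [ 3, 6 ] )
    ```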

    Flavio
    perl -ple'$_=reverse' <<<ti.xittelop@oivalf

    Don't fool yourself.
Re^3: Splitting multiline scalars into different array entries
by Tanalis (Curate) on Aug 05, 2005 at 11:19 UTC
    I'm certain that this isn't the best way, but it works:
    foreach my $row (@AoA) {
        my @foo = break_up( $row );
        if( $foo[0] =~ /ARRAY/ ) {
            push @new_rows, @foo;
        }
        else {
            push @new_rows, [ @foo ];
        }
    }
    print Dumper(\@new_rows);

    sub break_up {
        my $row = shift;

        my $uses_newlines = 0;
        $uses_newlines++ if join( "", @$row ) =~ /\n/m;

        return @$row unless $uses_newlines;

        my @tmp;
        map { push @tmp, [ split /\n/ ] } @$row;

        my @row;
        for( my $i = 0; $i <= $#tmp; $i++ ) {
            for( my $j = 0; $j <= $#tmp; $j++ ) {
                $row[$i][$j] = shift @{$tmp[$j]};
            }
        }
        return @row;
    }
    giving an output that matches the wanted structure above.

    Interesting problem :) What's this for, if I can ask?
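    The transposition step at the heart of break_up can also be seen in isolation. A minimal, self-contained sketch, with sample cell strings made up to match the wanted output above:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # One row whose cells each carry three newline-joined values,
    # like the mangled multiline cells in the example data.
    my $row = [ "multiline\nstilton\nback", "cells\nis\nto", "suck\ngreat\nlife" ];

    # Split each cell on newlines, then transpose: line $i of
    # every cell becomes row $i of the result.
    my @tmp = map { [ split /\n/ ] } @$row;
    my @rows;
    for my $i ( 0 .. $#tmp ) {
        for my $j ( 0 .. $#tmp ) {
            $rows[$i][$j] = shift @{ $tmp[$j] };
        }
    }
    # @rows is ( [ 'multiline', 'cells', 'suck' ],
    #            [ 'stilton',   'is',    'great' ],
    #            [ 'back',      'to',    'life'  ] )
    ```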

      Yep, that certainly works! Cheers! I'm going with broquaint's solution however.
      Interesting problem :) What's this for, if I can ask?
      Yeah, sure. I've got to process some (well, lots of) PDF files. I'm converting them to .xls (Excel) format with PDFConverter, then processing the files with Spreadsheet::ParseExcel. Even though PDFConverter's pretty good, it doesn't always cope with some tables, leaving me with a mess similar to my example data.

      davis
      Kids, you tried your hardest, and you failed miserably. The lesson is: Never try.