There are many ways to do this. I'm going to read the text file into a data structure that mirrors the output rather than the input, parsing directly into the records. I could have read into a series of hashes instead (a rough sketch of that alternative follows the code below). This just proves it's doable:
use strict;
use Text::CSV;

my @data;
my %header;

{
    my $idx = 0;
    my @single_row;

    # flush the record currently being built into @data
    sub eos {
        if (@single_row) {
            push @data, [ @single_row ];
            @single_row = ();
        }
    }

    # map a key to a stable column index, creating one on first sight
    sub index_for {
        my $key = shift;
        unless (exists $header{$key}) {
            $header{$key} = $idx++;
        }
        $header{$key};
    }

    while (<DATA>) {
        chomp;
        if (/^$/) {
            # end of section
            eos();
        }
        else {
            my ($key, $value) = split /\s*:\s*/, $_, 2;
            $single_row[ index_for($key) ] = $value;
        }
    }
    eos();
}

my $csv = Text::CSV->new();
my @h = sort { $header{$a} <=> $header{$b} } keys %header;
$csv->combine(@h);
print $csv->string(), $/;
$csv->combine(@$_), print $csv->string(), $/ foreach @data;

__END__
a: a1
b: b1
c: c1

a: a2
b: b2
c: c2

a: a3
b: b3
c: c3

a: a4
b: b4
c: c4
d: d1
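For comparison, here is a rough sketch of the "series of hashes" alternative mentioned above. This is my own illustration, not code from the original post; the variable names and the choice to pad missing columns with empty strings are assumptions about how one might write it:

use strict;
use warnings;
use Text::CSV;

my @records;   # one hash per record
my %record;    # record currently being built
my @order;     # column names in first-seen order
my %seen;

while (<DATA>) {
    chomp;
    if (/^$/) {                               # blank line: end of record
        push @records, { %record } if %record;
        %record = ();
    }
    else {
        my ($key, $value) = split /\s*:\s*/, $_, 2;
        push @order, $key unless $seen{$key}++;
        $record{$key} = $value;
    }
}
push @records, { %record } if %record;        # flush the last record

my $csv = Text::CSV->new();
$csv->combine(@order);
print $csv->string(), "\n";
for my $rec (@records) {
    # pad missing columns with empty strings so every row has the same width
    $csv->combine( map { defined $rec->{$_} ? $rec->{$_} : '' } @order );
    print $csv->string(), "\n";
}

__DATA__
a: a1
b: b1
c: c1

a: a2
b: b2
c: c2
d: d1

One practical difference: here the padding of short records out to the full header width is explicit (the map over @order), whereas the array-based version above can emit shorter rows for records that are missing the trailing columns.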