in reply to [Raku] Limited argument list issue in IO::String (Text::CSV)
G'day fishy,
I work with biological data that's often of comparable size; I have a 2GB CSV file that I use for volume testing. I would never attempt to slurp an entire file of that size; instead, I read and process it line by line. Apart from the huge memory overhead, slurping means the data is traversed twice: once to read it all into memory, and again to process it. I don't have Raku available, so everything below is Perl5.
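To make the slurp-versus-stream difference concrete, here's a minimal sketch (it just echoes the file; dummy.csv is the sample input shown further down, and the approach, not the output, is the point):

#!/usr/bin/env perl
use strict;
use warnings;
use autodie;    # open() and close() now die on failure

my $file_name = 'dummy.csv';

# Slurp: the whole file is held in memory, and the data is traversed twice
open my $slurp_fh, '<', $file_name;
my @lines = <$slurp_fh>;    # pass 1: read every line into a list
close $slurp_fh;
print for @lines;           # pass 2: walk the in-memory copy

# Stream: one line in memory at a time, a single pass over the file
open my $stream_fh, '<', $file_name;
while (my $line = <$stream_fh>) {
    print $line;            # handle each record as it is read
}
close $stream_fh;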
Here's a quick-and-dirty CSV version of the same idea. I've mostly used the same variable names as in your code; hopefully, a conversion to Raku won't be too difficult for you.
Input CSV:
$ cat dummy.csv
a,b,c
d,e,f
g,h,i
Perl5 script column_extract.pl:
#!/usr/bin/env perl

use strict;
use warnings;
use autodie;

use constant TARGET_COL => 1;    # zero-based index of the column to extract

use Text::CSV;

my $file_name = 'dummy.csv';
my @columns;

my $csv = Text::CSV::->new();

# autodie raises an exception if the open fails
open my $fh, '<', $file_name;

# One record in memory at a time; only the wanted field is retained
while (my $row = $csv->getline($fh)) {
    push @columns, $row->[TARGET_COL];
}

close $fh;

print "@columns\n";
Output:
$ ./column_extract.pl
b e h
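Untested, since as noted I don't have Raku here, but assuming the Raku port of Text::CSV keeps a getline method analogous to the Perl5 one, the conversion might look roughly like this (treat the method names as assumptions to check against the module's docs):

use Text::CSV;

constant TARGET-COL = 1;    # zero-based index of the column to extract

my $csv = Text::CSV.new;
my $fh  = open 'dummy.csv', :r;
my @columns;

# Assumed API: getline returns the next row, or a falsy value at EOF
while $csv.getline($fh) -> $row {
    @columns.push: $row[TARGET-COL];
}

$fh.close;
put @columns;    # put stringifies the array, joining fields with spaces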
— Ken