in reply to Re^2: Need help in reading csv files >1GB for converting into xlsx files in Solaris 10 - Perl v-5.8.4
in thread Need help in reading csv files >1GB for converting into xlsx files in Solaris 10 - Perl v-5.8.4
Hi ... this is the code I have used to benchmark the CSV parsers:
```perl
#use 5.18.2;
use warnings;
use Benchmark qw( cmpthese );
use Text::CSV_PP;
use Text::CSV_XS;

my ($fh, $csv);

# open the sample file; create a parser with pipe as separator
sub fh  { open $fh, "<", "Proj20101111.csv"; }
sub cxs { $csv = Text::CSV_XS->new ({ binary => 1, sep_char => "|" }) }
sub cpp { $csv = Text::CSV_PP->new ({ binary => 1, sep_char => "|" }) }

# compare a naive split against Text::CSV_PP, Text::CSV_XS,
# and Text::CSV_XS with bound columns
cmpthese (4, {
    perl => sub { fh;      while (my @row = split /\|/ => <$fh>) {} },
    c_xs => sub { fh; cxs; while (my $row = $csv->getline ($fh))  {} },
    c_pp => sub { fh; cpp; while (my $row = $csv->getline ($fh))  {} },
    cbxs => sub { fh; cxs; $csv->bind_columns (\my ($a, $b, $c, $d, $e, $f));
                           while ($csv->getline ($fh))            {} },
    });
close ($fh);
```
and my sample data would look like this:
That data comes in CSV files over 1 GB in size .... :( help! thanks!
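Since the thread is about converting such files to xlsx, here is a minimal sketch of a streaming conversion. It assumes Text::CSV_XS and Excel::Writer::XLSX are installed, and the file names are placeholders taken from the benchmark above; rows are read and written one at a time, so memory use stays flat even for multi-gigabyte inputs.

```perl
use strict;
use warnings;
use Text::CSV_XS;
use Excel::Writer::XLSX;

# pipe-separated input, as in the benchmark; auto_diag reports parse errors
my $csv = Text::CSV_XS->new ({ binary => 1, sep_char => "|", auto_diag => 1 });
open my $fh, "<", "Proj20101111.csv" or die "Proj20101111.csv: $!";

my $wb = Excel::Writer::XLSX->new ("Proj20101111.xlsx");
my $ws = $wb->add_worksheet;

my $r = 0;
while (my $row = $csv->getline ($fh)) {
    # an xlsx sheet holds at most 1_048_576 rows; roll over to a new
    # sheet when the current one is full
    if ($r >= 1_048_576) {
        $ws = $wb->add_worksheet;
        $r  = 0;
    }
    $ws->write_row ($r++, 0, $row);
}
close $fh;
$wb->close;
```

Note that Excel::Writer::XLSX buffers worksheet data; for truly huge outputs its `set_optimization` mode writes rows straight to disk, at the cost of requiring rows to be written in order.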
Re^4: Need help in reading csv files >1GB for converting into xlsx files in Solaris 10 - Perl v-5.8.4
by Tux (Canon) on Feb 12, 2015 at 14:25 UTC