Hi everyone! I have a Perl script that converts pipe-delimited CSV files to xlsx files on our Solaris SPARC server. The problem I am stuck on: it works fine for files in the MB range, but on files in the GB range it dies with an out-of-memory error. The box has 16 GB of RAM and runs the Perl 5.8.4 that ships with Solaris 10; we cannot upgrade Perl since it is a client's 64-bit Solaris server. Our production data volume is large, so the conversion has to stay automated through Perl. Is there a solution for this problem? This is the code I am using:
    use Excel::Writer::XLSX;

    my $workbook  = Excel::Writer::XLSX->new("$PathAndFile1.xlsx");
    my $worksheet = $workbook->add_worksheet();
    $worksheet->set_column( 'A:ED', 30 );
    $worksheet->freeze_panes( 1, 0 );    # freeze the header row once, not per cell

    open( FH, '<', $PathAndFile ) or die "Cannot open $PathAndFile: $!";
    my ( $row, $col ) = ( 0, 0 );
    while (<FH>) {                       # read the input line by line
        chomp;
        my @list = split /\|/, $_;
        foreach my $c (@list) {
            $worksheet->write( $row, $col++, $c );
        }
        $row++;
        $col = 0;
    }
    close(FH);
    $workbook->close();
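For reference, the Excel::Writer::XLSX docs describe a set_optimization() mode that writes each completed row out to temp files instead of holding the whole worksheet in memory, at the cost of some speed; it has to be enabled before add_worksheet() and requires rows to be written in ascending order, which the loop above already does. Below is a minimal sketch of the same conversion using it (untested on this box, and assuming the installed module version supports the method; $PathAndFile and $PathAndFile1 are the same variables assumed above):

    use Excel::Writer::XLSX;

    my $workbook = Excel::Writer::XLSX->new("$PathAndFile1.xlsx");
    $workbook->set_optimization();    # stream finished rows to temp files; call before add_worksheet()
    my $worksheet = $workbook->add_worksheet();
    $worksheet->set_column( 'A:ED', 30 );
    $worksheet->freeze_panes( 1, 0 );

    open( FH, '<', $PathAndFile ) or die "Cannot open $PathAndFile: $!";
    my $row = 0;
    while (<FH>) {
        chomp;
        # write_row() writes the whole split line into row $row starting at column 0;
        # in optimization mode rows must be written in order, top to bottom
        $worksheet->write_row( $row++, 0, [ split /\|/, $_ ] );
    }
    close(FH);
    $workbook->close();

Note that an xlsx worksheet tops out at 1,048,576 rows, so a multi-GB input may have to be split across worksheets or output files regardless of how the memory issue is solved.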
Please help! Thanks in advance!