I am trying to make Perl read xls files and save them as a tsv .txt file (which will be the target of many checks and reads and, finally, a transfer to MySQL). But I am having a hard time with worksheets bigger than five thousand lines: with those the script just crashes and stops responding, while with the smaller ones it works fine, even though it is not fast at all.
Does anyone have experience with handling large amounts of data in a task like this? Does the module in fact have this problem? Here's my code:
    use Spreadsheet::ParseExcel;

    my $oExcel = new Spreadsheet::ParseExcel;
    my $file   = "test4.xls";
    my $oBook  = $oExcel->Parse($file);
    my ( $iR, $iC, $oWkS, $oWkC );
    my @thisrow;    # to store the row for later processing

    # just the first worksheet
    $oWkS = $oBook->{Worksheet}[0];
    for ( $iR = $oWkS->{MinRow};
          defined $oWkS->{MaxRow} && $iR <= $oWkS->{MaxRow};
          $iR++ )
    {
        @thisrow = ();
        for ( $iC = $oWkS->{MinCol};
              defined $oWkS->{MaxCol} && $iC <= $oWkS->{MaxCol};
              $iC++ )
        {
            $oWkC = $oWkS->{Cells}[$iR][$iC];
            if ($oWkC) {
                push @thisrow, $oWkC->Value;
                print $thisrow[0];    # this last one is just a test; here would
                                      # go the print to the .txt file handle
            }
        }
    }
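In case it is useful, here is a rough sketch of a lower-memory variant. Newer releases of Spreadsheet::ParseExcel document a CellHandler callback together with a NotSetCell option, which let you process each cell as it is parsed instead of keeping the whole workbook in memory; the option and method names below are from my reading of that documentation and should be checked against the version installed on your host. The output filename test4.txt is just a placeholder.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Spreadsheet::ParseExcel;

    my $file = 'test4.xls';
    open my $tsv, '>', 'test4.txt' or die "Cannot open test4.txt: $!";

    my $current_row = -1;
    my @row_cells;

    # Write the buffered row as one tab-separated line and reset the buffer.
    sub flush_row {
        print {$tsv} join( "\t", map { defined $_ ? $_ : '' } @row_cells ), "\n"
            if $current_row >= 0;
        @row_cells = ();
    }

    # Called for every cell as it is parsed; with NotSetCell the cell is not
    # kept in the workbook afterwards, so memory use stays roughly constant.
    sub cell_handler {
        my ( $workbook, $sheet_index, $row, $col, $cell ) = @_;

        return unless $sheet_index == 0;    # first worksheet only, as in your code

        if ( $row != $current_row ) {
            flush_row();
            $current_row = $row;
        }
        $row_cells[$col] = $cell->value();
    }

    my $parser = Spreadsheet::ParseExcel->new(
        CellHandler => \&cell_handler,
        NotSetCell  => 1,
    );
    $parser->Parse($file) or die "Could not parse $file\n";

    flush_row();    # write out the last buffered row
    close $tsv or die "Cannot close output: $!";

Even without the callback, writing each finished row with something like print {$tsv} join("\t", @thisrow), "\n"; instead of printing $thisrow[0] keeps the output side simple; but the crash on big sheets most likely comes from Parse() building the whole workbook in memory, which is what the callback approach is meant to avoid.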
I also know that I could use the unix shell to do the conversion, but I asked the guys at my host and they said they had never heard of a command-line converter for that. I don't know if it was just that guy who didn't know, or whether it really isn't available there...
Thanks a lot,
André