You are accumulating 5 complete copies of your data in memory: @csvdata; @sorted_data; the list that map builds here: map{ $_ = join ", ", @{$_} } @sorted_data; the list you feed into join here: join "\n", map{ ...; and the entire thing as a single huge string here: my $temp = join "\n", ....
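To make the counting concrete, here is a minimal sketch of that pattern on a tiny dataset; the variable names follow your post, but the row contents are made up for illustration. Each commented step materialises another full copy of the data:

```perl
use strict;
use warnings;

my @csvdata = map { [ $_, $_ * 2 ] } 1 .. 3;              # copy 1: the parsed rows

my @sorted_data = sort { $a->[0] <=> $b->[0] } @csvdata;  # copy 2: the sorted rows

# copy 3: map builds a full list of joined strings;
# copy 4: join consumes that whole list at once;
# copy 5: $temp then holds the entire output as one huge string.
my $temp = join "\n", map { join ", ", @{$_} } @sorted_data;

print length($temp), " bytes held in \$temp\n";
```

For three short rows that is harmless; for a large CSV file it means roughly five times the file's footprint in RAM.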
By a slight rearrangement of your code you can avoid 3 of these:
    ...

    # now sort
    my @sorted_data = sort tableSorter @csvdata;
    @csvdata = ();   ## Discard the unsorted data

    ## csvify and output the sorted data one line at a time,
    ## so avoiding two more in-memory copies

    # arrange the sorted array into CSV format
    print $outfile join( ", ", @{$_} ), "\n" for @sorted_data;

    # close the output file
    close $outfile;
In theory, there is an optimisation in newer versions of Perl that will sort an array in-place if the output and input arrays are the same:
@csvdata = sort tableSorter @csvdata;
But it doesn't always seem to kick in?
Beyond that, you might need to resort to an external sort utility (e.g. your system's sort command), though you may need to pre- and/or post-process your files to let it produce the sort order you need.
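As a rough sketch of that approach, you can pipe the file through the system's sort(1) from Perl so that only one line at a time is ever held in memory. The file name data.csv and the sort key (numeric on the second comma-separated column) are placeholders; adjust the -t and -k options to your real sort order, and note this assumes a POSIX-style sort is on your PATH:

```perl
use strict;
use warnings;

# Write a small demo file so the example is self-contained.
open my $fh, '>', 'data.csv' or die "Cannot write data.csv: $!";
print $fh "b,30\na,10\nc,20\n";
close $fh;

# Pipe the file through the external sort; the list form of open
# avoids the shell.  Perl only ever holds one line at a time.
open my $sorted, '-|', 'sort', '-t,', '-k2,2n', 'data.csv'
    or die "Cannot run sort: $!";

my @out;
while ( my $line = <$sorted> ) {
    chomp $line;
    push @out, $line;   # or print straight to your output file
}
close $sorted or die "sort exited nonzero: $?";

print join( "|", @out ), "\n";   # rows now ordered by column 2
```

If your sort order is more exotic than sort(1) can express directly, the usual trick is to prepend a synthetic sort key to each line (pre-process), sort on that, then strip it off again (post-process).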
In reply to Re: memory problems parsing large csv file
by BrowserUk
in thread memory problems parsing large csv file
by turquoise_man