in reply to memory problems parsing large csv file

You are accumulating 5 complete copies of your data in memory:

1. @csvdata;
2. @sorted_data;
3. the list you feed into map here: map{ $_ = join ", ", @{$_} } @sorted_data;
4. the list you feed into join here: join "\n", map{
5. and the entire thing as a single huge string here: my $temp = join "\n",
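For reference, the overall shape being described is roughly this (a reconstruction, not your actual code: the read loop, the tableSorter body and the file names are assumed; the handles and variable names follow the snippets above):

    use strict;
    use warnings;

    sub tableSorter { $a->[0] cmp $b->[0] }    # placeholder; your real comparator goes here

    open my $infile,  '<', 'in.csv'  or die "in.csv: $!";
    open my $outfile, '>', 'out.csv' or die "out.csv: $!";

    my @csvdata     = map { chomp; [ split /,\s*/ ] } <$infile>;   # copy 1: the parsed rows
    my @sorted_data = sort tableSorter @csvdata;                   # copy 2: the sorted rows

    my $temp = join "\n",                        # copy 5: the single huge string
               map { $_ = join ", ", @{$_} }     # copy 4: the list join is fed
               @sorted_data;                     # copy 3: the list map is fed

    print $outfile $temp;
    close( $outfile );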

By a slight rearrangement of your code you can avoid 3 of these:

    ...

    # now sort
    my @sorted_data = sort tableSorter @csvdata;

    @csvdata = (); ## Discard the unsorted data

    ## csvify and output the sorted data one line at a time
    ## so avoiding two more in-memory copies

    # arrange the sorted array into csv format
    print $outfile join( ", ", @{$_} ), "\n" for @sorted_data;

    # close the output file
    close( $outfile );

In theory, there is an optimisation in newer versions of Perl that will sort an array in-place if the output and input arrays are the same:

@csvdata = sort tableSorter @csvdata;

But it doesn't always seem to kick in.
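When it does kick in, combining it with the line-at-a-time output above leaves only the one copy in @csvdata (a sketch; handles and tableSorter as before):

    @csvdata = sort tableSorter @csvdata;    # may sort in place on recent perls
    print $outfile join( ", ", @{$_} ), "\n" for @csvdata;
    close( $outfile );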

Beyond that, you might need to resort to using an external sort utility (e.g. your system's sort utility), though you might need to pre- and/or post-process your files to allow it to produce the sort order you need.
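For instance, a sketch of piping the file through the system's sort(1) from Perl. The -t and -k switches here (numeric sort on the second comma-separated field) and the file names are purely illustrative; you would substitute whatever matches the order your tableSorter produces:

    use strict;
    use warnings;

    # Read the output of the external sort; perl never holds the whole file.
    open my $sorted,  '-|', 'sort', '-t,', '-k2,2n', 'in.csv'
        or die "Cannot run sort: $!";
    open my $outfile, '>', 'out.csv' or die "out.csv: $!";

    print $outfile $_ while <$sorted>;

    close( $sorted );
    close( $outfile );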



Re^2: memory problems parsing large csv file
by turquoise_man (Initiate) on Aug 24, 2009 at 14:18 UTC
    Thanks for the replies, guys. I suspect I will need a mixture of them all! I will try your suggestions and post back when (if!) it's sorted.