in reply to Re^2: Out of Memory when generating large matrix
in thread Out of Memory when generating large matrix

Delegating it to sort - be it the shell command or the Perl built-in - is a waste of resources and doesn't scale well.

Except, of course, once the hash has become too large for memory. Then a disk-based mergesort (well, a merge-count, really) is the only thing that still scales, provided you have enough disk space to write the merged results.
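The classic Unix pipeline already does exactly this merge-count: GNU sort spills sorted runs to temporary files and merges them, so memory use stays bounded regardless of input size. A minimal sketch (the file name kmers.txt and its contents are made-up examples, one k-mer per line):

```shell
# Build a tiny example input: one k-mer per line (hypothetical data).
printf 'ACGT\nTTAA\nACGT\nACGT\nTTAA\nGGCC\n' > kmers.txt

# sort performs a disk-based mergesort (temporary runs under $TMPDIR),
# uniq -c collapses adjacent duplicates into counts, and the final
# sort -rn ranks the k-mers by frequency.
sort kmers.txt | uniq -c | sort -rn > counts.txt
cat counts.txt
```

With GNU sort the in-memory buffer can be capped explicitly, e.g. `sort -S 100M`; everything beyond that goes to disk.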


Replies are listed 'Best First'.
Re^4: Out of Memory when generating large matrix
by LanX (Saint) on Mar 06, 2018 at 10:38 UTC
    Sure, but the OP is talking about counting through 197 x 8000 records; that's at most ~1.5 million hash entries if every entry is counted only once. IIRC that would come to about 150 MB of RAM (and that upper bound is already a pathological case).

    IF the RAM wasn't even sufficient for counting, presorting a giant file wouldn't help either. (wc could help, but an output with 1.5 million columns should be avoided...)

    I'd surely opt for a DB like SQLite.
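    That route can be sketched with the sqlite3 command-line shell (a made-up table and example k-mers; from Perl the same queries would go through DBI with DBD::SQLite). The rows live in a disk file, so RAM stays flat no matter how many records come in:

```shell
rm -f kmers.db          # start fresh for this toy run

sqlite3 kmers.db <<'SQL'
-- One row per observed k-mer; the table lives on disk, not in RAM.
CREATE TABLE kmers (kmer TEXT);
INSERT INTO kmers VALUES ('ACGT'), ('TTAA'), ('ACGT'), ('ACGT'), ('GGCC');

-- Counting is then a single aggregate query; an index on kmer
-- would keep this fast at real data volumes.
SELECT kmer, COUNT(*) AS n FROM kmers GROUP BY kmer ORDER BY n DESC;
SQL
```

    A `LIMIT` clause on that query would also give the "most frequent K-mers only" view directly.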

    But I suppose only some thousands of the most frequent K-mers are of interest.

    (I can imagine a solution with a Hash of Hashes for counting, where only the most relevant hashes are kept in memory while the others are swapped out, but I think this would extend the scope of this thread.)

    Cheers Rolf
    (addicted to the Perl Programming Language and ☆☆☆☆ :)
    Wikisyntax for the Monastery