Sure. I ran the following command, which creates a 1-million-key HoHoH (hash of hashes of hashes) and then pauses:
perl -e"$h{ $_ }{1} = { 1 .. 4 } for 1..1e6; <>"
I then looked at my process monitor and saw that the process required ~600MB of memory, so I multiplied by 3 (the number of files) to come up with ~1.8GB. That involves a few assumptions, e.g. that the chromosomes within the 3 files substantially overlap. I could equally have multiplied by 5 and arrived at 3GB. My guess is that your actual requirement will fall somewhere in between.
But either way, the total memory requirement is very unlikely to seriously threaten your 50GB maximum, so there was no need to suggest that you change your methods.
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.