Short scenario:
I have noticed with Perl on OS X that when I close a filehandle, the memory does not get completely deallocated. So when my Perl script opens and closes lots of (extremely large) files, the memory isn't freed, the VSIZE keeps growing, and eventually I get a malloc error. Any ideas?
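In its simplest form, the pattern is just this (the path and sizes here are stand-ins for what the real script does):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Repeatedly open a file, write a big chunk, and close it.
    # This is the shape of what my script does; in the real script,
    # each pass like this leaves VSIZE a little higher.
    for my $i (1 .. 10_000) {
        open my $fh, '>', '/tmp/scratch.tmp' or die "open: $!";
        print {$fh} 'x' x 10_000_000;   # ~10 MB per pass, stand-in for real data
        close $fh or die "close: $!";
    }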
Long, detailed scenario:
I am sorting extremely large files (>2 GB) on Mac OS X. I take a chunk of the file (usually about 100,000 lines), sort it, and store it in a temp file. Then I take another chunk of the file, sort those lines, and merge them with the first temp file into a second temp file. Repeat ad nauseam. I use two temp files and toggle between reading from and writing to them.

What I am doing now is this: every time I want to merge my sorted lines with a temp file, I open both temp files (one for reading, one for writing), and then close both of them before grabbing the next chunk of lines. For some reason, closing the files isn't freeing memory the way it should, so if I keep this up I get a malloc error, and then the system panics. (Note: this doesn't happen on Linux.)

I am trying to rework it so I can leave the files open, but on OS X the O_RDWR|O_CREAT flags for sysopen aren't actually letting me read the file, just write to it, and the same goes for +>> with open. Any ideas?
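For reference, the merge loop is shaped roughly like this (a simplified sketch, not my exact code; the file names are placeholders and the real records are more involved):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my @tmp = ('sorted_a.tmp', 'sorted_b.tmp');   # placeholder temp file names
    my ($src, $dst) = (0, 1);

    open my $big, '<', 'huge_input.txt' or die "open: $!";
    open my $seed, '>', $tmp[$src] or die "open: $!";   # start with an empty run
    close $seed;

    while (1) {
        # Grab the next chunk of up to 100,000 lines and sort it.
        my @chunk;
        while (@chunk < 100_000 and defined(my $line = <$big>)) {
            push @chunk, $line;
        }
        last unless @chunk;
        @chunk = sort @chunk;

        # Merge the sorted chunk with the previous run into the other temp file.
        open my $in,  '<', $tmp[$src] or die "open $tmp[$src]: $!";
        open my $out, '>', $tmp[$dst] or die "open $tmp[$dst]: $!";
        my $old = <$in>;
        for my $new (@chunk) {
            while (defined $old and $old le $new) {
                print {$out} $old;
                $old = <$in>;
            }
            print {$out} $new;
        }
        while (defined $old) { print {$out} $old; $old = <$in>; }
        close $in;
        close $out;    # it's after many of these closes that VSIZE keeps climbing

        ($src, $dst) = ($dst, $src);   # toggle which temp file is read vs. written
    }

And for the leave-the-files-open attempt, this is the kind of thing I've been trying with sysopen. (My understanding is that on a read/write handle you're supposed to seek when switching between reading and writing, per the usual stdio rules, so the explicit seeks are there for that.)

    use Fcntl qw(:DEFAULT :seek);

    sysopen(my $fh, 'sorted_a.tmp', O_RDWR | O_CREAT, 0644)
        or die "sysopen: $!";

    seek($fh, 0, SEEK_SET) or die "seek: $!";   # rewind before reading
    my $line = <$fh>;                           # this is the read that gets me nothing

    seek($fh, 0, SEEK_END) or die "seek: $!";   # reposition before switching to writes
    print {$fh} "merged line\n";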