99.9% of the time, I am very happy that Perl manages memory for me, letting arrays and hashes grow transparently without my having to take care of it.
I certainly do not regret the days when I had to do my own mallocs, callocs, reallocs, frees, and memsets every time I wanted dynamic memory allocation. To tell the truth, I still use C once in a while, and that reminds me how happy I am to be using Perl instead of C most of the time. No problems with null or dangling pointers, no memory leaks (except for special cases such as circular or reciprocal references), no core dumps or segmentation faults (well, almost never), no out-of-bounds arrays, and so on and so forth. Gosh, Perl is so much nicer than C.
No, I really disagree with you. If Perl were to introduce malloc and its siblings, I would certainly go back to the other dynamic languages I used before Perl (Tcl, Python), or straight to newer ones such as Ruby.
Besides, although I haven't tested this recently and don't have time to check right now, I am not sure that a free in a C program actually returns memory to the OS. I would guess some OSes do that, but probably not the majority of them. I may be wrong on this last point; I never tested it extensively, since I rarely had serious data-size problems back when I was using C intensively.
"but I am not really sure that a free in a C program freeing some memory returns it to the OS."
I stand corrected. In C too, there is no guarantee that memory will be returned to the system and make the program smaller. From the GNU libc documentation for free:
"Occasionally, free can actually return memory to the operating system and make the process smaller. Usually, all it can do is allow a later call to malloc to reuse the space. In the meantime, the space remains in your program as part of a free-list used internally by malloc."
So it seems that it is not just an interpreted language problem.
The releases you mention in those links are very specific, and generally cannot be counted on in writing a Perl script. The key word in those links is "sometimes", which is about as reliable as weather prediction. :-)
Thank you everyone for your replies and comments. It has been an interesting discussion.
I ended up using zentara's suggestion of forking the large-memory section, so that once the forked process ended, its memory was returned. This works well, and since I was already using Parallel::ForkManager elsewhere in the code, I only had to make a couple of changes. Thank you!
For those interested, in addition to following Anonymous Monk's advice and passing the $coverage hashref as an argument to the sub, the other changes I made to get this working as I wanted were as follows:
- First, load Parallel::ForkManager and configure a maximum of one concurrent fork:
use Parallel::ForkManager;
my $fm = Parallel::ForkManager->new(1);
Inside the foreach loop, mark the start of the code to be run within the fork and initiate it:
foreach my $t (@times){
$fm->start and next;
At the end of the loop, mark the end of the forked code and finish the fork. Outside the loop, block the main process until all forked processes have finished:
print "what's the mem doing?\n";
sleep(10);
$fm->finish();
}
$fm->wait_all_children();
That's it. The forked process releases the memory when it finishes and the next process starts afresh.
Cheers, Rich