Hi,
I was returning a 1 ("return 1;"); I have changed it to "return;" and will see what happens. I have also tried undef'ing many of the data structures within the sub once I am done with them, before the sub returns. The funny thing is that this sub prints info to STDOUT, and if I comment out some of those print statements, the run works and completes; but when I print everything I need to STDOUT, it dies.
That's peculiar-sounding behavior. Is there any way you can reproduce the problem in a suitably short and nonproprietary snippet?
Returning a one should mean it wasn't the problem I suspected, BTW. Leaving that in place instead of using a bare return should be fine. The usual reason for such a memory jump is copying some sort of large data structure, so you might want to look for other places where that could be happening.
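To illustrate what I mean (a minimal sketch; the sub names and data sizes here are invented): passing a big hash flattens it into a list of key/value pairs, and assigning that list to a new hash inside the sub copies everything, while passing a reference costs a single scalar.

    use strict;
    use warnings;

    # Invented example data: 1000 keys with 1KB values.
    my %big = map { $_ => 'x' x 1024 } 1 .. 1000;

    takes_copy(%big);    # %big is flattened into a long list of key/value pairs
    takes_ref(\%big);    # only a single reference is passed

    sub takes_copy {
        my %h = @_;      # copies every key and value into a new hash
        return;
    }

    sub takes_ref {
        my $href = shift;        # one scalar
        my $v = $href->{1};      # read through the reference; no copy made
        return;
    }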
Another possible scenario just popped into my head, but it's pure speculation. Are you perchance calling the subroutine inside a while ( <filehandle> ) { .. } loop? If you are and it's a large binary file, it might be hundreds of megabytes before a newline appears. Again, a reproduction of the problem in a short snippet that doesn't give up any private data would be nice.
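If that turns out to be the case, one workaround is to read the file in fixed-size records instead of lines, by setting $/ to a reference to a size. A sketch, assuming the file really is binary ('data.bin' is a placeholder filename):

    use strict;
    use warnings;

    open my $fh, '<:raw', 'data.bin' or die "open data.bin: $!";

    {
        local $/ = \65536;    # read fixed 64KB records, not lines
        while (my $chunk = <$fh>) {
            # process $chunk here instead of a whole "line"
        }
    }
    close $fh;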
YES! YES! I am!
I have something like this:
    while (<$DATFILE>) {
        %hashFile = ();                # reset the hash for each record
        $hashFile{key1} = $value1;     # placeholder keys and values;
        $hashFile{key2} = $value2;     # some of these values are very large
        $hashFile{key3} = $value3;
        $hashFile{key4} = $value4;
        InsertData(\%hashFile);
        $cnt++;
    }
and some of these values are very large, e.g. the full text of files, so the hash itself can be very large at times.
But the frustrating part is that it doesn't blow up until after something like the millionth iteration of the while loop. And it blows up right at the $cnt++ increment: I can return from the subroutine fine and step to $cnt++, but as soon as I step past $cnt++ the memory blows up. I'm going to change the filehandle from a typeglob to a regular lexical filehandle. I don't think it will help, but I'll see.
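In case it helps, here is a sketch of one way the loop might be restructured so each iteration's data is released: declaring the hash with my() inside the loop lets it go out of scope every pass instead of living on as a package variable. Whether this actually helps depends on where the memory is really going; everything here ('data.dat', the stub InsertData) is a placeholder.

    use strict;
    use warnings;

    open my $DATFILE, '<', 'data.dat' or die "open data.dat: $!";

    my $cnt = 0;
    while (my $line = <$DATFILE>) {
        # A lexical hash declared inside the loop goes out of scope at the
        # end of each iteration, so perl can reclaim its memory, whereas a
        # long-lived package %hashFile keeps its largest footprint around.
        my %hashFile;
        $hashFile{text} = $line;    # stand-in for the large values
        InsertData(\%hashFile);
        $cnt++;
    }
    close $DATFILE;

    sub InsertData {
        my ($href) = @_;    # take the reference; avoid copying the hash
        return;             # stub standing in for the real insert logic
    }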