chuckd has asked for the wisdom of the Perl Monks concerning the following question:

I'm having a problem with my script. I'm opening a handle to sqlldr through a pipe like this:
open($sqlldr, "|sqlldr args1 args2 blah blah") or die "blah blah";
foreach (<record from dat file>) {
    print $sqlldr "$_\n";
}
close($sqlldr) or die "blah blah";
I'm reading a file of 1 million+ records that contain very large pieces of text. The script keeps crapping out after about a million records; it stops before I completely finish loading them all. I'm watching the memory usage in the taskbar, and it keeps allocating more memory every minute it runs; after a few hours it's over 1 GB. I can't see any other part of the script that would be allocating this much without freeing anything. I have a feeling that I need to flush the $sqlldr file handle buffer after each print statement, but I'm not sure. Does anyone know how to do that (flush the $sqlldr buffer after each print)? Is there any way to see which parts of my Perl script are using how much memory?
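On the last question (seeing what is using memory), a minimal sketch using Devel::Size, assuming the module is installed; %records here is just a stand-in for whatever structure the real script accumulates:

use Devel::Size qw(size total_size);

my %records;   # stand-in for a data structure the script builds up
# size() measures the structure itself; total_size() also follows references
print "hash alone:  ", size(\%records), " bytes\n";
print "hash + data: ", total_size(\%records), " bytes\n";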

Re: perl is allocating massive memory
by chromatic (Archbishop) on Jan 27, 2009 at 23:29 UTC

    Change foreach to while, to start.

    If that doesn't fix things, only then worry about flushing buffers (which are probably 4 KB, so not the source of your memory problem).
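    If you do end up wanting autoflush on the pipe, a minimal sketch, assuming a lexical handle as in the original post (the sqlldr arguments are made up for illustration):

    use IO::Handle;   # lets you call methods such as autoflush() on filehandles

    open(my $sqlldr, '|-', 'sqlldr control=load.ctl')
        or die "can't start sqlldr: $!";
    $sqlldr->autoflush(1);   # flush the handle after every print instead of buffering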

    Internally, foreach evaluates the readline in list context, so it pulls the entire file into a list in memory and only then starts processing it; while reads and processes one record at a time. The latter is almost always the way to go when reading from a file.
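    A hedged sketch of the record-at-a-time version, with the .dat file name and sqlldr arguments invented for illustration:

    open(my $dat, '<', 'records.dat')
        or die "can't open records.dat: $!";
    open(my $sqlldr, '|-', 'sqlldr control=load.ctl')
        or die "can't start sqlldr: $!";

    while (my $record = <$dat>) {   # reads one line at a time instead of slurping the whole file
        print $sqlldr $record;      # $record already ends in a newline, so no extra "\n"
    }

    close($dat)    or die "can't close records.dat: $!";
    close($sqlldr) or die "problem closing sqlldr pipe: $! (exit status $?)";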