I'm having a problem with my script. I'm opening a handle to sqlldr (SQL*Loader) with a pipe, like this:
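(Simplified; the userid/control values here are placeholders, not my real ones, but the shape is this:)

    # open a pipe to sqlldr and print records to it
    # userid/control values below are placeholders
    open(my $sqlldr, '|-', 'sqlldr userid=user/pass control=load.ctl')
        or die "Can't open pipe to sqlldr: $!";

    # then, for each record read from the input file:
    print $sqlldr $record;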
I'm reading a file of 1 million+ records that contain very large pieces of text. The script keeps crapping out after about 1 million records, before it finishes loading everything. I'm watching the memory usage in Task Manager, and it keeps allocating more memory with every minute it runs; after a few hours it's at more than 1 GB. I can't see any other part of the script that would be allocating this much without freeing anything.

I have a feeling I need to flush the $sqlldr file handle's buffer after each print statement, but I'm not sure. Does anyone know how to do this (flush the $sqlldr buffer after each print statement)? Also, is there any way to see which parts of a Perl script are using how much memory?
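From reading perldoc, I'm guessing the flush would be something like this, though I don't know whether it's the right fix for the memory growth:

    use IO::Handle;            # gives file handles an autoflush method

    $sqlldr->autoflush(1);     # flush the pipe after every print

    # or the older $| idiom:
    my $old = select($sqlldr);
    $| = 1;
    select($old);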