rizzy has asked for the wisdom of the Perl Monks concerning the following question:
Basically, I open up the directory and get a list of every file in it. Next, each file is individually opened and slurped as a string into $html. I immediately undefine the string and repeat the loop. I can't understand why the memory isn't freed up. It should actually be freed in three places on each pass through the loop, shouldn't it? (1) when I define $html as '', (2) when I slurp the contents of the next file into $html, and (3) when I undef $html.

    #!C:/Perl/bin -w
    use File::Listing qw(parse_dir);

    my $dir = 'c:/mydir/';

    # open the directory and get filenames
    opendir(TEMP, $dir) || die("Cannot open directory");
    @thefiles = readdir(TEMP);
    closedir(TEMP);

    $maxsize = 0;

    # cycle through each of the files
    foreach $file (@thefiles) {
        unless ( ($file eq ".") || ($file eq "..") ) {
            $filesize = -s $dir.$file;
            if ($filesize > $maxsize) { $maxsize = $filesize }
            print "$file - $maxsize - $filesize\n";
            my $html = '';
            $slurpfile = $dir.$file;
            open( my $fh, $slurpfile ) or die "couldn't open\n";
            my $html = do { local( $/ ); <$fh> };
            undef $html;
        }
    }
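For comparison, here is a minimal sketch of the same loop tightened up under strict and warnings, with a lexical directory handle, a three-argument open, and an explicit close on each pass. The path c:/mydir/ and the variable names are simply the ones from the question above; this is one way the loop could be restructured, not a claim about where the memory growth actually comes from.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $dir = 'c:/mydir/';    # the OP's directory; substitute as needed

    opendir( my $dh, $dir ) or die "Cannot open directory '$dir': $!";
    my @thefiles = grep { $_ ne '.' && $_ ne '..' } readdir($dh);
    closedir($dh);

    my $maxsize = 0;

    foreach my $file (@thefiles) {
        my $path     = $dir . $file;
        my $filesize = -s $path;
        next unless defined $filesize;        # skip anything -s cannot stat
        $maxsize = $filesize if $filesize > $maxsize;
        print "$file - $maxsize - $filesize\n";

        open( my $fh, '<', $path ) or die "couldn't open '$path': $!";
        my $html = do { local $/; <$fh> };    # slurp the whole file
        close($fh);                           # release the handle and its buffer each pass

        # ... work with $html here ...
    }   # $html and $fh fall out of scope here, so perl can reuse that memory on the next pass

Also worth keeping in mind (this depends on the build and its malloc): even after a lexical like $html is freed, perl normally keeps that memory in its own pool for reuse rather than returning it to the operating system, so the process size reported by Windows may stay near the high-water mark set by the largest file slurped.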
Replies are listed 'Best First'.
Re: Memory Leak when slurping files in a loop (sliding window explained)
    by LanX (Saint) on Dec 07, 2010 at 11:16 UTC
    by rizzy (Sexton) on Dec 07, 2010 at 16:49 UTC
    by rizzy (Sexton) on Dec 07, 2010 at 16:42 UTC
Re: Memory Leak when slurping files in a loop
    by ww (Archbishop) on Dec 07, 2010 at 04:36 UTC
    by Anonymous Monk on Dec 07, 2010 at 04:46 UTC
Re: Memory Leak when slurping files in a loop
    by Anonymous Monk on Dec 07, 2010 at 04:24 UTC