I hate suggesting this, but it will probably make the out-of-memory problem go away with the least effort :)
Change
my ($Size)=( (qx{dir /s "$TargetPath"})[-2]=~ /([\d,]+) bytes/ );
to
my ($Size)=( (qx{dir /s "$TargetPath" | find "bytes" })[-2]=~ /([\d,]+) bytes/ );
That will filter out the vast majority of the lines produced by dir /s before they ever reach Perl.
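If you'd rather not hold any of that output in memory at all, something along these lines (an untested sketch; it assumes the same $TargetPath and the English-language dir output, and the path shown is just a placeholder) reads the output a line at a time and keeps only the grand-total line:

use strict;
use warnings;

my $TargetPath = 'C:\\some\\folder';   # assumption: set this to the folder you want to total
open my $dir, qq{dir /s "$TargetPath" |} or die "Can't run dir: $!";
my $Size;
while ( <$dir> ) {
    # the last "... bytes" line that isn't the "bytes free" line is the grand total
    $Size = $1 if /([\d,]+) bytes(?! free)/;
}
close $dir;
defined $Size or die "No total found in the dir output\n";
$Size =~ tr/,//d;                      # strip the thousands separators
print "$Size\n";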
If you have WSH available, then you can use
use Win32::OLE; my $size = Win32::OLE->new( 'Scripting.FileSystemObject' )->GetFolder( $TargetPath )->Size;
That probably isn't hugely more efficient than dir /s in terms of doing the calculation, but it does avoid the memory problem.
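If you go that route, a slightly fuller sketch with basic error checking might look like this (assumes Win32::OLE is installed and $TargetPath is already set; the path shown is just a placeholder):

use strict;
use warnings;
use Win32::OLE;

my $TargetPath = 'C:\\some\\folder';   # assumption: placeholder path
my $fso = Win32::OLE->new( 'Scripting.FileSystemObject' )
    or die "Couldn't create FileSystemObject: ", Win32::OLE->LastError;
my $folder = $fso->GetFolder( $TargetPath )
    or die "Couldn't open '$TargetPath': ", Win32::OLE->LastError;
print $folder->Size, " bytes\n";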
You could also accumulate the directory sizes directly in Perl:
my (@dirs, @files) = ($TargetPath);   # seed @dirs with the target path; @files starts empty
# repeatedly glob a directory off @dirs, filing each entry into @dirs or @files
scalar map{ push @{ (-d) ? \@dirs : \@files }, $_ } glob pop(@dirs) . '/*' while @dirs;
my $size = 0;
$size += -s for @files;               # total up the file sizes
From my quick tests that is much faster than shelling out, and it seems to be quicker than using an FSO, but it's difficult to measure because file system caching gets involved.
You could also use any of the many File::Find modules to achieve the same effect.
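For instance, a minimal sketch using the core File::Find module (again assuming the same $TargetPath, with a placeholder path):

use strict;
use warnings;
use File::Find;

my $TargetPath = 'C:\\some\\folder';   # assumption: placeholder path
my $size = 0;
# find() calls the sub for every entry under $TargetPath; inside it, $_ is the current name
find( sub { $size += -s if -f }, $TargetPath );
print "$size bytes\n";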