in reply to Out of memory.

The problem is that you are shelling out and doing a dir /s of an entire drive in order to extract the size from the second-from-last line of the output.

Even on my half empty 2GB boot partition, this results in 1.5 MB of data and 36,000 lines!

On a share that points to a well-used 40GB drive with compression enabled, this could easily produce a file list of several tens of MB and 200,000 lines or more. Pulling all this data across the network and into an anonymous list just to find out the amount of space used on a remote share is ... well, more than just slightly profligate.

There are a number of vastly more efficient ways (in terms of memory, time and network load) of getting this information. The question has come up at least three times in the last week or so, including this one. A Super Search for "disk usage" or "share size" should turn up several more.


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller

Replies are listed 'Best First'.
Re: Re: Out of memory.
by blackadder (Hermit) on Jul 23, 2003 at 09:23 UTC
    Thanks for that, but it's the folder sizes I am after, not the drive sizes. I do get the drive sizes via a combination of AdminMisc and Netadmin, but the OLE->GetDrive method you have suggested is definitely much faster.

    I have searched for some time for a remedy for this problem, to no avail… The struggle shall continue.

      I hate suggesting this, but it will probably make the out-of-memory problem go away with the least effort. :)

      Change

      my ($Size)=( (qx{dir /s "$TargetPath"})[-2]=~ /([\d,]+) bytes/ );

      to

      my ($Size)=( (qx{dir /s "$TargetPath" | find "bytes" })[-2]=~ /([\d,]+) bytes/ );

      That will filter out the vast majority of the lines produced by the dir /s before they get into perl.

      If you have WSH available, then you can use

      use Win32::OLE;
      my $size = Win32::OLE->new( 'Scripting.FileSystemObject' )
                           ->GetFolder( $TargetPath )
                           ->Size;

      Which probably isn't hugely more efficient than dir /s in terms of doing the calculation, but it will prevent the memory problem.

      You could also accumulate the directory sizes directly in perl

      my (@dirs, @files) = ($TargetPath);
      scalar map { push @{ (-d) ? \@dirs : \@files }, $_ }
          glob( pop(@dirs) . '/*' )
              while @dirs;
      my $size = 0;
      $size += -s for @files;

      Which, from my quick tests, is much faster than shelling out, and seems to be quicker than using an FSO, but it's difficult to measure as file system caching gets involved.

      You could also use any of the many File::Find modules to achieve the same effect.
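      For instance, with the CPAN module File::Find::Rule (a sketch, assuming that module is installed; plain core File::Find works just as well):

```perl
use strict;
use warnings;
use File::Find::Rule;          # CPAN module, not core
use List::Util qw(sum);

my $TargetPath = '.';          # example; substitute the share path

# ->file selects plain files only; ->in() walks the tree recursively
# and returns the matching paths, which we then stat for their sizes.
my $size = sum( map { -s } File::Find::Rule->file->in($TargetPath) ) || 0;
print "$size bytes\n";
```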


      Examine what is said, not who speaks.
      "Efficiency is intelligent laziness." -David Dunham
      "When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller

        Ew. map in void context. Hand-rolled directory traversal (though that's not as bad on Win32). How about
        use File::Find;
        my $size;
        find( sub { $size += -s _ if -f }, $TargetPath );
        (Yes, File::Find is slower than hand-rolled traversal. A heck of a lot less (and clearer) code, though, and it still beats the dir /s approach by miles.)

        Makeshifts last the longest.

        Thanking you sir,

        The first method did not work.

        Error:
        DrvTop=> D:\ZZ-TESTRESTORE (Shared by remote command.) Sizing \\ukz423\D$\ZZ-TESTRESTORE
        The name specified is not recognized as an internal or external command, operable program or batch file.
        The second method ignored subdirectories. It would be useful to me if I knew how to get it to include subdirectories; there was no help in the documentation.

        The last method failed to produce any output. I tried printing $size and @files but got zilch displayed.

        So basically I am back to square root of 1.
        Thanking you kind sir, however;
        This did not work:
        my ($Size)=( (qx{dir /s "$TargetPath" | find "bytes" })[-2]=~ /([\d,]+) bytes/ );
        I got the following error:
        DrvTop=> D:\APPS () Sizing \\Server118\D$\APPS
        The name specified is not recognized as an internal or external command, operable program or batch file.
        Use of uninitialized value in pattern match (m//) at C:\Scripts\shr_info1.pl line 44 (#1)
        (W uninitialized) An undefined value was used as if it were already defined. It was interpreted as a "" or a 0, but maybe it was a mistake. To suppress this warning assign a defined value to your variables.
        To help you figure out what was undefined, perl tells you what operation you used the undefined value in. Note, however, that perl optimizes your program and the operation displayed in the warning may not necessarily appear literally in your program. For example, "that $foo" is usually optimized into "that " . $foo, and the warning will refer to the concatenation (.) operator, even though there is no . in your program.
        Use of uninitialized value in concatenation (.) or string at C:\Scripts\shr_info1.pl line 45 (#1)
        The OLE method worked but the values were not accurate because it ignored subdirectories; it would be useful if I knew how to get it to include subdirectories as well. I had a look in the documentation but there was not even a mention of GetFolder. A quick search on PM… nothing.

        This method seems to be the easiest for me, but only if I knew how to get the subdirectories… Do I need to write code that will recurse into subdirectories? But then I am back to square 1!
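        If the FSO's Size really is skipping subfolders in this environment, one way around it is to walk the SubFolders collection by hand (a sketch, assuming Win32::OLE's in() enumerator and the stock FileSystemObject COM objects; folder_size() is an illustrative helper, and the path is just an example from the thread):

```perl
use strict;
use warnings;
use Win32::OLE qw(in);    # in() enumerates COM collections

# Recursively sum the sizes of all files under an FSO Folder object.
sub folder_size {
    my ($folder) = @_;
    my $size = 0;
    $size += $_->Size        for in $folder->Files;       # files directly here
    $size += folder_size($_) for in $folder->SubFolders;  # recurse into subfolders
    return $size;
}

my $TargetPath = '\\\\Server118\\D$\\APPS';   # example share path
my $fso = Win32::OLE->new('Scripting.FileSystemObject')
    or die "Could not create FileSystemObject\n";
print folder_size( $fso->GetFolder($TargetPath) ), " bytes\n";
```

        This only needs WSH/scripting runtime on the machine, so it avoids shelling out entirely; it is Windows-only, of course.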

        I am about to try out the final method…