in reply to Using READDIR runs out of memory

I am running a script that opens a directory and puts files that end in .html into an array. The directory contains 1 million files total, with about half of them having the .html extension. When I run my script I get "Out of Memory".
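The failing script itself was not shown, but here is a minimal sketch of what it presumably looks like (the variable names are mine, not the OP's):

    # Hypothetical reconstruction -- not the OP's actual code.
    # readdir in list context returns all remaining entries at once,
    # so with 1 million files both that list and the list built by
    # grep must fit in memory at the same time.
    opendir my $d, $dirname or die "Could not open $dirname: $!";
    my @html_files = grep { /\.html$/ } readdir $d;
    closedir $d;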

I would not expect that to happen. Perl should easily handle an array containing a million records. Anyway, another approach would be to iterate over the directory, so that only one filename is in memory at any time. Something like this:

opendir my $d, $dirname or die "Could not open $dirname: $!";
while (defined(my $item = readdir $d)) {    # scalar context: one entry per call
    $item =~ /\.html$/ or next;             # skip anything not ending in .html
    work_on_the_item($item);
}
closedir $d;

Alexander

--
Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)

Re^2: Using READDIR runs out of memory
by DenairPete (Novice) on Mar 20, 2018 at 19:30 UTC

    Thanks Alexander! I don't believe it's the Array storage that's the problem. It's the READDIR

      I don't believe it's the Array storage that's the problem.

      Well, I don't believe in the FSM, but it may still exist after all.

      Why don't you simply test if iterating solves the problem?

      It's the READDIR

      It's called readdir, not READDIR. And its behaviour is very different when used in scalar context instead of list context: in list context it returns all remaining directory entries at once, while in scalar context it returns just the next one. With a million files, that difference is exactly your "Out of Memory". RTFM.
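
      A minimal illustration of the difference, assuming an already opened directory handle $d:

          my @all = readdir $d;    # list context: builds a list of every remaining entry
          my $one = readdir $d;    # scalar context: returns a single entry per call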

      Alexander

      --
      Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)