I am running a script that opens a directory and puts files that end in .html into an array. The directory contains 1 million files total, with about half of them having the .html extension. When I run my script I get "Out of Memory":
I would not expect that to happen. Perl should easily handle an array containing a million records. Anyway, another approach would be to iterate over the directory. Something like this:
opendir my $d, $dirname or die "Could not open $dirname: $!";
while (defined(my $item = readdir $d)) {
    # skip anything that does not end in .html
    $item =~ /\.html$/ or next;
    work_on_the_item($item);
}
closedir $d;
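Note that readdir returns bare file names, not paths, so if work_on_the_item needs to open the file, prepend the directory first (e.g. "$dirname/$item").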
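And if the array itself is what you want, half a million names should fit comfortably in memory. A minimal sketch of that version, assuming the same $dirname as above:

opendir my $d, $dirname or die "Could not open $dirname: $!";
# read all entries in one pass and keep only the .html ones
my @html = grep { /\.html$/ } readdir $d;
closedir $d;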
Alexander