in reply to Using READDIR runs out of memory
I am running a script that opens a directory and puts files that end in .html into an array. The directory contains 1 million files total, with about half of them having the .html extension. When I run my script, I get "Out of Memory":
I would not expect that to happen. Perl should easily handle an array containing a million records. Anyway, another approach would be to iterate over the directory entries one at a time instead of collecting them all first. Something like this:
    opendir my $d, $dirname or die "Could not open $dirname: $!";
    while (defined(my $item = readdir $d)) {
        $item =~ /\.html$/ or next;
        work_on_the_item($item);
    }
    closedir $d;
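To show the pattern end to end, here is a minimal, self-contained sketch of the same loop; the temporary directory, the sample files, and the counter standing in for work_on_the_item() are just illustration, not part of the original post:

    use strict;
    use warnings;
    use File::Temp qw(tempdir);

    # Throwaway directory with a few files (illustration only).
    my $dirname = tempdir(CLEANUP => 1);
    for my $name (qw(a.html b.html c.txt)) {
        open my $fh, '>', "$dirname/$name" or die "open $name: $!";
        close $fh;
    }

    # Process matching entries one at a time instead of slurping
    # the whole directory listing into an array.
    my $count = 0;
    opendir my $d, $dirname or die "Could not open $dirname: $!";
    while (defined(my $item = readdir $d)) {
        $item =~ /\.html$/ or next;
        $count++;    # stand-in for work_on_the_item($item)
    }
    closedir $d;
    print "$count\n";    # prints 2

Because each entry is handled and then discarded, memory use stays constant no matter how many files the directory holds.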
Alexander
Re^2: Using READDIR runs out of memory
by DenairPete (Novice) on Mar 20, 2018 at 19:30 UTC
by afoken (Chancellor) on Mar 20, 2018 at 19:42 UTC