DenairPete has asked for the wisdom of the Perl Monks concerning the following question:
I need the wisdom of PerlMonks! I am running a script that opens a directory and puts the files that end in .html into an array. The directory contains one million files in total, about half of which have the .html extension. When I run my script I get "Out of memory". Here is how I am pushing the files into the array:
opendir(DIR, $accumulatorDir) or die "$!\n";
my @jrnFiles = grep { /\.html$/ } readdir DIR;
closedir(DIR);
Is there a semi-efficient alternative I can use? Java has no problem doing this with its "java.io.File.listFiles".
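One common way around this, sketched below: call readdir in scalar context inside a while loop, so entries are streamed one at a time instead of the whole directory listing (plus the filtered array) being held in memory at once. The directory path and the counting body are placeholders; substitute your own $accumulatorDir and per-file processing.

```perl
use strict;
use warnings;

# A minimal sketch: in scalar context, readdir returns one entry per
# call, so this loop uses roughly constant memory no matter how many
# files the directory holds.
my $accumulatorDir = '.';    # placeholder; point at your own directory

opendir(my $dh, $accumulatorDir)
    or die "Cannot open $accumulatorDir: $!\n";

my $count = 0;
while (defined(my $entry = readdir $dh)) {
    next unless $entry =~ /\.html\z/;
    $count++;    # process "$accumulatorDir/$entry" here instead of just counting
}
closedir($dh);

print "Found $count .html files\n";
```

The defined() test matters: a file literally named "0" would otherwise end the loop early, since "0" is false in Perl.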
Replies are listed 'Best First'.
Re: Using READDIR runs out of memory
by afoken (Chancellor) on Mar 20, 2018 at 19:21 UTC
  by DenairPete (Novice) on Mar 20, 2018 at 19:30 UTC
  by afoken (Chancellor) on Mar 20, 2018 at 19:42 UTC
Re: Using READDIR runs out of memory
by Marshall (Canon) on Mar 20, 2018 at 20:46 UTC
Re: Using READDIR runs out of memory
by Anonymous Monk on Mar 20, 2018 at 19:18 UTC