I need the wisdom of PerlMonks! I am running a script that opens a directory and puts files that end in .html into an array. The directory contains 1 million files total, with about half of them having the .html extension. When I run my script I get "Out of memory!". Here is how I am pushing the files into the array:
opendir(DIR, $accumulatorDir) or die "$!\n";
my @jrnFiles = grep /\.html$/, readdir DIR;
closedir(DIR);
Is there an alternative I can use that is semi-efficient? Java has no problem doing this with java.io.File.listFiles.
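One memory-friendly option is to iterate the directory one entry at a time instead of pulling the whole listing into a list: calling readdir in scalar context inside a while loop returns a single entry per call, so the full million-entry listing never has to exist in memory at once. A minimal sketch (the directory path and the per-file processing are placeholders, not from the original script):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical path standing in for $accumulatorDir from the post.
my $accumulatorDir = '/path/to/accumulator';

opendir my $dh, $accumulatorDir
    or die "Cannot open $accumulatorDir: $!\n";

# Scalar-context readdir: one entry per loop iteration,
# so memory use stays constant regardless of directory size.
while ( my $file = readdir $dh ) {
    next unless $file =~ /\.html$/;
    # process each matching file here, e.g.:
    # handle_file("$accumulatorDir/$file");
}

closedir $dh;
```

If the later processing truly needs the full list, the array will still be large, but streaming avoids holding both readdir's internal list and the array simultaneously, and lets each file be handled and discarded as it is seen.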
In reply to Using READDIR runs out of memory by DenairPete