Perhaps not related to the speed problem: your code may open a lot of directory handles while recursing, tying up a limited resource. File and directory handles should generally be treated as limited resources. Your code should close the directory handle BEFORE recursing into the filesystem, not after. You could read the entire directory contents into an array, close the handle, and then iterate over the array.
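A minimal sketch of that approach, assuming a hypothetical list_tree function that prints every non-directory under a start directory (this is not the OP's code):

use strict;
use warnings;

sub list_tree {
    my ($dir) = @_;
    opendir my $dh, $dir or die "opendir $dir: $!";
    # Read the entire directory into an array first ...
    my @entries = grep { $_ ne '.' && $_ ne '..' } readdir $dh;
    # ... then close the handle BEFORE recursing.
    closedir $dh or die "closedir $dir: $!";
    for my $entry (@entries) {
        my $path = "$dir/$entry";
        if (-d $path && !-l $path) {    # skip symlinks to avoid loops
            list_tree($path);
        } else {
            print "$path\n";
        }
    }
}

list_tree('.');

At any moment only one directory handle is open, no matter how deep the tree is.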
An even better way would be a queue (think of it as a to-do list) instead of recursion. The queue is a simple array that starts with the directory to be "explored". While the array is not empty, shift the first element off and use it as a directory name to open a directory handle. Read all directory entries, push subdirectories onto the array, and handle non-directories directly in the loop. Then close the directory handle.
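A minimal sketch of the queue approach (again not the OP's code; starting at '.' is an assumption):

use strict;
use warnings;

my @queue = ('.');                # the to-do list, seeded with the start directory
while (@queue) {
    my $dir = shift @queue;       # take the first directory off the queue
    opendir my $dh, $dir or die "opendir $dir: $!";
    while (defined(my $entry = readdir $dh)) {
        next if $entry eq '.' or $entry eq '..';
        my $path = "$dir/$entry";
        if (-d $path && !-l $path) {
            push @queue, $path;   # explore this subdirectory later
        } else {
            print "$path\n";      # handle non-directories directly
        }
    }
    closedir $dh or die "closedir $dir: $!";
}

Using shift gives a breadth-first order; using pop instead would give a depth-first-like order, still without recursion and still with only one directory handle open at a time.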
Perhaps related to the speed problem: if the output is initially fast and slows down over time, you are leaking resources, forcing the system to start swapping. Your code uses close instead of closedir to close $DIR. close cannot close a directory handle. You would have noticed that if you had added proper error handling (... or die "Can't close: $!", or autodie):
>perl -Mstrict -w -e 'opendir my $dir,"." or die "opendir: $!";close $dir or die "close: $!";'
close: Bad file descriptor at -e line 1.
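For comparison, the same one-liner with closedir (the matching call for opendir) prints nothing and exits successfully, because closing the directory handle actually works:

>perl -Mstrict -w -e 'opendir my $dir,"." or die "opendir: $!";closedir $dir or die "closedir: $!";'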
Alexander