The optimization enabled by dont_use_nlink => 0 (the default) relies on a link-count convention that FAT and NTFS file systems don't follow, and you appear to be using such a file system.
The optimization only helps for directories that contain no subdirectories, the simplest case being empty directories (i.e. directories with no entries except . and ..). If you have few such directories, the optimization isn't buying you much anyway. If you have many that are genuinely empty, maybe you could delete them in advance to gain the same benefit as the optimization.
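One way to do that advance cleanup, sketched with GNU find (the path is a placeholder):

```shell
# Delete empty directories ahead of time. -delete implies -depth, so
# children are visited first and a parent that becomes empty once its
# empty children are removed is deleted in the same pass.
# -mindepth 1 keeps the root of the tree itself from being deleted.
find /path/to/tree -mindepth 1 -type d -empty -delete
```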
An explanation of the optimization:
On unix file systems, a directory's . is a hardlink to itself, and a directory's .. is a hardlink to its parent directory. So when you stat a directory, the link count returned by stat will be 1 (its name in the parent) + 1 (its own .) + $num_subdirs (one .. per subdirectory), i.e. 2 + $num_subdirs.
    $ ls -ld .
    drwx------ 5 ikegami ikegami 46 Dec 16 12:03 .  # 5: Up to 3 subdirs
    $ ls -l .
    total 0
    drwx------ 2 ikegami ikegami 10 Dec 16 12:03 a  # 2: Empty
    drwx------ 3 ikegami ikegami 24 Dec 16 12:03 b  # 3: Up to 1 subdir
    drwx------ 2 ikegami ikegami 10 Dec 16 12:03 c  # 2: Empty
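You can reproduce the arithmetic on any unix file system; a quick sketch, assuming GNU coreutils' stat (the %h format prints the link count):

```shell
# Show that a directory's link count is 2 + number of subdirectories.
dir=$(mktemp -d)
mkdir "$dir/a"          # an empty subdirectory
mkdir -p "$dir/b/sub"   # a subdirectory that itself has one subdirectory
touch "$dir/a/file"     # plain files do not affect the link count

stat -c '%h' "$dir"     # 2 + 2 subdirs (a and b)
stat -c '%h' "$dir/a"   # 2 + 0 subdirs
stat -c '%h' "$dir/b"   # 2 + 1 subdir
rm -rf "$dir"
```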
File::Find relies on that information to optimize itself when possible: if a directory's link count is 2, it contains no subdirectories, so File::Find can process every entry without stat-ing it to check whether it's a directory.
Perl and File::Find know this isn't the case for the FAT and NTFS file systems, so the optimization is automatically disabled on Windows. On a CIFS mount from a unix client, though, you have to disable it yourself by passing dont_use_nlink => 1.
In reply to Re: File::Find won't iterate through CIFS share withouth "dont_use_nlink"
by ikegami
in thread File::Find won't iterate through CIFS share withouth "dont_use_nlink"
by reinaldo.gomes