in reply to file globbing in a nested while loop

Another thing to note is that <*/> isn't a very clean way to find subdirectories; a better way is to use the -d file test operator. I would propose something akin to this (untested):
    . . .
    local $_;   # always a good habit unless you know why you don't want it
    chdir $Dir or die "chdir $Dir: $!";
    opendir DIR, '.' or die "opendir: $!";
    my (@dirs, @files);
    for (readdir DIR) {
        next if /^\.\.?$/;                   # we don't want to catch the . and .. entries
        push @dirs,  $_ if -d and not -l;    # only non-symlink dirs please
        push @files, $_ if -f and /\.ext$/;  # -f because it could be a directory or
                                             # other non-file called "something.ext"
    }
    closedir DIR;
    . . .
Afterwards you have the directories and desired files in the appropriate arrays. For completeness' sake: using readdir() like this can be noticeably slower than a glob when directories contain a lot of files (as in several thousand). However, unless you're running a heavy-load application and this piece of code is among your script's bottlenecks, it won't make a difference, and it is more maintainable and less prone to errors IMHO.

But anyway - this was just an exercise; for real work, you should indeed rely on File::Find. :)
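For reference, a minimal sketch of the File::Find version (untested against your layout; the .ext suffix and the find_ext_files name are just placeholders mirroring the readdir example above):

    use strict;
    use warnings;
    use File::Find;

    # Collect all regular files ending in .ext anywhere under $root,
    # without following symlinks (File::Find's default), much like the
    # -l check in the readdir loop above.
    sub find_ext_files {
        my ($root) = @_;
        my @files;
        find(
            {
                wanted => sub {
                    # $_ is the basename and we're chdir'd into the
                    # containing directory, so -f and the regex both work
                    push @files, $File::Find::name if -f && /\.ext$/;
                },
                follow => 0,
            },
            $root
        );
        return @files;
    }

The recursion and the . / .. filtering are handled for you, which is exactly why File::Find is the better choice once the nesting gets non-trivial.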