As you can see, it basically takes the contents of a directory and puts them into either a directory array or a file array. It differentiates between the two by checking whether FTP->cwd("tested item") (change directory) succeeds. It then recursively calls itself on everything in the directory array. As is, it works fine, but for large sites (4000+ files and directories) it takes a LONG time and is quite server-transaction intensive. Is there a better way to do this? I looked in the Net::FTP docs and I don't see anything that really fits the bill. I would also like to know if there is a way to read file attributes on an FTP site. Anyone?

#!/usr/bin/perl -w
use strict;
use Net::FTP;

my $ftp = Net::FTP->new("ftp address here");

print "Content-type: text/html\n\n";
print "<html><title>FTP Contents</title><center><head>The Tank FTP List</head></center>";
print "<body bgcolor=black>";

if ($ftp->login("username", 'password')) {
    lister(0);
}
else {
    print "FTP: Failed to login";
}
$ftp->quit;
print "</body></html>";

#------ Subroutines ------------
sub lister {
    my $lev = $_[0];
    my @dirs;
    my @files;
    my @dirarray = $ftp->ls();
    foreach my $item (@dirarray) {
        # If we can cwd into it, it's a directory; step back out and record it
        if ($ftp->cwd($item)) {
            $ftp->cdup();
            push @dirs, $item;
        }
        else {
            push @files, $item;
        }
    }
    my $buff = "   " x $lev;    # indentation per nesting level
    foreach my $dir (@dirs) {
        print "<font color=yellow>$buff\|_ $dir</font><br>";
        $ftp->cwd($dir);
        lister($lev + 1);
    }
    foreach my $file (@files) {
        print "<font color=lime>$buff\|_ $file</font><br>";
    }
    $ftp->cdup();
}
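One way to cut the transaction count dramatically: instead of probing every entry with a cwd()/cdup() pair (two round trips per item), fetch one long listing per directory with $ftp->dir() and classify entries by the type character in the first column. This is a sketch that assumes a Unix-style "ls -l" listing format, which not every FTP server produces, so check what your server returns first; the classify() helper below is mine, not part of Net::FTP.

```perl
#!/usr/bin/perl -w
use strict;

# Split the lines of a single $ftp->dir() listing into directories and
# plain files, using the leading type character ('d' = directory,
# '-' = plain file). Symlinks ('l') and other types are skipped here.
sub classify {
    my @lines = @_;
    my (@dirs, @files);
    foreach my $line (@lines) {
        next if $line =~ /^total\s/;    # skip the "total NN" header line
        next unless $line =~ /\S/;      # skip blank lines
        my ($type) = $line =~ /^(.)/;
        # Take the last whitespace-separated field as the name
        # (assumes no spaces in filenames -- a real parser should do better)
        my ($name) = $line =~ /\s(\S+)\s*$/;
        next unless defined $name;
        if    ($type eq 'd') { push @dirs,  $name }
        elsif ($type eq '-') { push @files, $name }
    }
    return (\@dirs, \@files);
}

# In lister(), you would then replace the cwd-probe loop with:
#   my ($dirs, $files) = classify($ftp->dir());
# which costs one server transaction per directory instead of two per entry.
```

For robust parsing of varied server listing formats, the File::Listing module (from the libwww-perl distribution) may be worth a look. As for file attributes: Net::FTP does offer $ftp->size($file) and $ftp->mdtm($file) for size and modification time, though each is its own server round trip, so the long listing is still cheaper in bulk.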
In reply to A better way to recurse FTP directories by Snuggle