in reply to Ways to limit bandwidth of downloads?
Here's one way. If you supply the Hash => \*FHGLOB parameter to the Net::FTP constructor, a "hash mark" is printed to the filehandle glob for each buffer load. If you tie the glob before passing it, the PRINT method will be called after each buffer is read. If you insert a short sleep at that point, you effectively throttle the request for the next buffer load.
This demonstrates the technique. Performing the calculations to allow the download rate to be specified, and wrapping it all up in a clean interface, is left as an exercise for those who need it.
```perl
#! perl -slw
use strict;
use Time::HiRes qw[ time sleep ];
use Net::FTP;

$| = 1;

## Minimal tied-handle class (package main doubles as the tie class).
sub TIEHANDLE { return bless [ 0 ], $_[ 0 ] }

## Called by Net::FTP each time it prints a hash mark, i.e. once per buffer.
sub PRINT {
    my $self = shift;
    if( $self->[ 0 ] ) {
        my $delay = ( $self->[ 0 ] + 1 - time() );
        printf "%f\n", $delay;
        sleep 1 + $delay;    ## Insert delay
    }
    $self->[ 0 ] = time();   ## Remember when this buffer arrived
}

local *GLOB;
tie *GLOB, 'main';

my( $site, $dir, $file ) = $ARGV[ 0 ] =~ m[
    ^(?:ftp://)? ([^/]+) (/.*?) / ([^/]+$)
]x or die "Couldn't parse url";

my $ftp = Net::FTP->new( $site, Hash => \*GLOB ) or die $@;
$ftp->login( 'anonymous', 'anonymous@' );
$ftp->cwd( $dir ) or die $@;
$ftp->get( $file ) or die $@;
```
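The "calculations to allow the download rate to be specified" amount to: if each PRINT call corresponds to a fixed-size chunk (Net::FTP prints one hash mark per block of transferred bytes), sleep just long enough that successive chunks are spaced chunk_bytes / target_rate seconds apart. A sketch of that arithmetic, shown here in Python rather than Perl purely for illustration (the class and parameter names are mine, not from the thread):

```python
import time

class Throttle:
    """Pace fixed-size chunks so the average transfer rate
    approaches rate_bytes_per_sec. Call tick() once per chunk,
    i.e. from the same place the tied PRINT method runs."""

    def __init__(self, rate_bytes_per_sec, chunk_bytes=1024):
        # Desired wall-clock spacing between chunk arrivals.
        self.interval = chunk_bytes / rate_bytes_per_sec
        self.next_ok = None  # earliest time the next chunk should complete

    def tick(self):
        now = time.monotonic()
        if self.next_ok is not None and now < self.next_ok:
            # Chunk arrived early: sleep off the surplus bandwidth.
            time.sleep(self.next_ok - now)
            now = self.next_ok
        self.next_ok = now + self.interval
```

The same bookkeeping slots straight into the PRINT method above: store next_ok in the blessed arrayref instead of the last-arrival timestamp, and sleep the difference when the chunk arrives early.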
Replies are listed 'Best First'.

- Re^2: Ways to limit bandwidth of downloads? by Tanktalus (Canon) on Apr 18, 2006 at 16:45 UTC
  - by BrowserUk (Patriarch) on Apr 18, 2006 at 16:59 UTC