in reply to Re^2: Bandwidth limiting for file downloads: What can Perl do?
in thread Bandwidth limiting for file downloads: What can Perl do?

The following program downloads a file very slowly. Maybe this gets you started.

#!/usr/bin/perl -w
use strict;
use WWW::Mechanize;
use 5.020;
use feature 'signatures';
no warnings 'experimental::signatures';

my $mech = WWW::Mechanize->new();
my $large_url = 'http://ftp.acc.umu.se/mirror/wikimedia.org/dumps/dewiki/20220420/dewiki-20220420-abstract.xml.gz';
$| = 1;
my $read_size = 0;
$mech->get( $large_url,
    ':read_size_hint' => 4096,
    ':content_cb' => sub {
        $read_size += length( $_[0] );
        my $len = length($_[0]);
        print "\r$len - $read_size bytes";
        # discard the content
        sleep 1;
    },
);
say 'done';

Re^4: Bandwidth limiting for file downloads: What can Perl do?
by Polyglot (Chaplain) on May 01, 2022 at 01:45 UTC
    So why might that work with WWW::Mechanize and not with LWP::UserAgent, which is supposed to accept the same callbacks? Is this a bug in the latter? With Mechanize, it did reduce the bandwidth considerably, but at the end of 10+ minutes I had no file. How does one keep the file, too?

    Blessings,

    ~Polyglot~

      The same code should work with LWP::UserAgent, because WWW::Mechanize inherits from it.

      If you want to keep the file, write the data out to a file inside the callback instead of just printing the length of each chunk you receive.
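
      Here is a minimal sketch of that approach using LWP::UserAgent directly, assuming the same URL as above; the output filename is my own choice and any path will do. The ':read_size_hint' and ':content_cb' keys are the documented LWP::UserAgent options for processing the response as it arrives, and the callback simply writes each chunk to an open filehandle:

      #!/usr/bin/perl
      use strict;
      use warnings;
      use 5.020;
      use LWP::UserAgent;

      my $url     = 'http://ftp.acc.umu.se/mirror/wikimedia.org/dumps/dewiki/20220420/dewiki-20220420-abstract.xml.gz';
      my $outfile = 'dewiki-20220420-abstract.xml.gz';   # output name is an assumption; use whatever you like

      my $ua = LWP::UserAgent->new();
      open my $fh, '>:raw', $outfile or die "Cannot open $outfile: $!";

      $| = 1;
      my $read_size = 0;

      my $res = $ua->get( $url,
          ':read_size_hint' => 4096,          # ask for small chunks
          ':content_cb'     => sub {
              my ($chunk, $response, $protocol) = @_;
              print {$fh} $chunk;             # keep the data instead of discarding it
              $read_size += length $chunk;
              print "\r$read_size bytes";
              sleep 1;                        # crude throttle: roughly one chunk per second
          },
      );

      close $fh or die "Cannot close $outfile: $!";
      say "\ndone: ", $res->status_line;

      The sleep is only a blunt brake, of course; real bandwidth shaping would time the chunks against a target rate rather than pausing a fixed second after each one.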