I've done the "Super Search"; I've searched online; and apart from "common sense" suggestions such as compressing the files before transfer, the "best" suggestions all led away from a pure-Perl solution: use wget, rsync, cURL, etc. A few stray suggestions involved hacking Net::FTP and adding a "sleep" call to the transfer loop.
Links to some of those prior discussions:
But most of those suggestions are a decade or more old. Even shorewall-perl never made it simple or easy, as its documentation explains here: Traffic Shaping/Control -- and, of course, shorewall worked by configuring Linux's iptables.
What can one do with Perl now? Is it possible to rate-limit one's downloads via a simple Perl command or module so as to play nicely with the server's resources?
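To make the question concrete: the "sleep" hack mentioned above can at least be sketched in pure Perl, assuming you drive the read loop yourself (i.e. you have a plain filehandle or socket to read from, which Net::FTP's one-shot get() does not hand you directly). The bytes-per-second limit and the 8 KiB chunk size below are illustrative choices, not anything a CPAN module provides:

```perl
use strict;
use warnings;
use Time::HiRes qw(time sleep);

# A sleep-based throttle: read in chunks, and sleep whenever we are
# ahead of the byte budget implied by $limit_bps (bytes per second).
sub throttled_copy {
    my ($in, $out, $limit_bps) = @_;
    my ($total, $start) = (0, time);
    while (my $n = read($in, my $buf, 8192)) {
        print {$out} $buf;
        $total += $n;
        my $expected = $total / $limit_bps;  # seconds these bytes "should" take
        my $elapsed  = time - $start;
        sleep($expected - $elapsed) if $expected > $elapsed;
    }
    return $total;
}

# Demo with in-memory filehandles: 64 KiB capped at 32 KiB/s -> about 2 s.
open my $in,  '<', \("x" x 65536) or die $!;
open my $out, '>', \my $copy      or die $!;
my $t0    = time;
my $bytes = throttled_copy($in, $out, 32768);
printf "copied %d bytes in %.1f s\n", $bytes, time - $t0;
```

This pacing approach (sleep until total bytes divided by the limit catches up with elapsed time) avoids drift better than a fixed sleep per chunk, but it is still just the hack dressed up; it does nothing about what the server or the kernel buffers in the meantime.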
Blessings,
~Polyglot~
In reply to Bandwidth limiting for file downloads: What can Perl do? by Polyglot