RaduH has asked for the wisdom of the Perl Monks concerning the following question:
At some point in my Perl script I am downloading a huge file (a tgz, 500 MB to 1 GB in size) using a form of secure copy (I have used Net::SSH2 and SFTP; right now I am doing an scp inside a system call, and I am not sure what the final variant will be). Because this happens on a remote computer (my local script invokes a script on a remote computer, and that script performs the download), and because it takes a while, I would like the remote script to send progress reports back to my computer (say, every 10% of the download).
Using scp there is real-time feedback on the screen, so I can follow the download percent by percent, but that feedback appears on the remote computer and not where I can see it. Any ideas how I could solve this problem efficiently? I could spin in a loop and check the size of the partial file against the size of the entire file (which should be easy to figure out), but that seems like a lot of spinning, especially for a 1 GB tarball. Are there other options out there that I don't know about? It doesn't have to be scp; any kind of secure copy will do. In case it makes a difference, this does not run in a web page; it's simple, plain Perl, no other strings attached.
Thanks!
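A minimal sketch of the size-polling idea mentioned above. The path, total size, and helper name are placeholders (you would get the total beforehand, e.g. by running `stat` on the remote host); the point is that polling need not mean busy spinning — a `stat` call every few seconds is negligible even for a 1 GB transfer:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper: watch a partially downloaded file and print a
# progress line every $step_pct percent. $local_path and $total_bytes
# are assumptions -- substitute the real destination path and the size
# reported by the remote side. $total_bytes must be > 0.
sub report_progress {
    my ( $local_path, $total_bytes, $step_pct, $poll_secs ) = @_;
    my $last_pct = -1;
    while (1) {
        my $size = ( -s $local_path ) // 0;    # 0 until the file appears
        my $pct  = int( 100 * $size / $total_bytes );
        if ( $pct >= $last_pct + $step_pct ) {
            print "downloaded $pct%\n";
            $last_pct = $pct;
        }
        last if $size >= $total_bytes;
        sleep $poll_secs;                      # cheap: one stat per poll
    }
}
```

Run from the remote script while the copy proceeds in another process; its stdout travels back over the ssh connection, so the progress lines appear locally.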
Replies are listed 'Best First'.

Re: Long download progress report
by Corion (Patriarch) on Nov 16, 2007 at 17:06 UTC

Re: Long download progress report
by gamache (Friar) on Nov 16, 2007 at 17:19 UTC

Re: Long download progress report
by BrowserUk (Patriarch) on Nov 16, 2007 at 19:24 UTC