I need to FTP two files from my web server to a remote server periodically as I explained here: http://perlmonks.org/?node_id=702448
The problem I'm having is that execution from an SHTML page occasionally times out, which interrupts the process. Since I'm transferring two files, I decided to see if I could shorten the execution time by doing the transfers in parallel rather than sequentially. I have written the following script to try to accomplish this:
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

my $host = "ftp.remoteserver.com";

if (fork) {
    my $ftp = Net::FTP->new($host, Passive => 1, Timeout => 300, Debug => 0)
        or die "Cannot contact $host: $!";
    $ftp->login("user1", 'password1');
    $ftp->binary();
    $ftp->put("/home/users/web/blah/directory/file1");
    $ftp->quit;
} else {
    my $ftp = Net::FTP->new($host, Passive => 1, Timeout => 300, Debug => 0)
        or die "Cannot contact $host: $!";
    $ftp->login("user2", 'password2');
    $ftp->binary();
    $ftp->put("/home/users/web/blah/directory/file2");
    $ftp->quit;
}

print "Content-type: text/html\n\n";
print <<"EOF";
<font face="arial" color=red>
Transfer Complete!
</font>
EOF
exit;

The script seems to save about 50% of the execution time, and early indications are that it has solved my problem. My previous script ran the two transfers in sequence.
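Not part of the original script, but a minimal sketch of the fork skeleton it uses (the FTP calls are replaced by comments): the parent can waitpid() for the child before printing, so the completion message is emitted exactly once, and only after both processes have finished their work.

```perl
#!/usr/bin/perl
# Sketch only: no real FTP work, just the fork/wait pattern.
use strict;
use warnings;

my $pid = fork();
die "Cannot fork: $!" unless defined $pid;

if ($pid) {
    # Parent: the put() for file1 would go here.
    waitpid($pid, 0);              # block until the child has exited
    print "Transfer Complete!\n";  # printed exactly once, by the parent
} else {
    # Child: the put() for file2 would go here.
    exit 0;                        # exit so the child never reaches the print
}
```

With this shape, both transfers still run in parallel, but the page output is produced by a single process after both are done.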
Since I'm very new to Perl scripting, please have a look and then answer these simple questions for me:
- Does the script actually produce parallel transfers?
- Can you suggest any improvements?

Curiously, the script sends "Transfer Complete! Content-type: text/html Transfer Complete!" to my web page after completion; however, I believe I now understand why this happens. Thanks as always for your advice.
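The duplicated output can be reproduced with a hypothetical minimal example (not the original script): because the print statements come after the if/else, they run in BOTH the parent and the child, so the message reaches the browser twice. This sketch runs that pattern in a subprocess and counts how often the message appears.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Run the fork-then-print pattern in a child perl and capture its output.
# Parent and child both fall through to the print after the if.
my $out = `$^X -e 'if (my \$pid = fork) { waitpid(\$pid, 0); } print "Transfer Complete!\\n";'`;

my @hits = grep { /Transfer Complete!/ } split /\n/, $out;
printf "message printed %d times\n", scalar @hits;   # prints "message printed 2 times"
```

The usual fix is either to have the child exit at the end of its branch, or to print only in the parent after reaping the child.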
In reply to Shortening Execution Time by DrWho_100