I recommend sleep-and-retry for the inevitable transmission problems:
sub removeRemoteFile {
    my $file = shift;
    for (1 .. 5) {
        return TRUE if $sftp->remove($file);
        sleep 30;
    }
    return FALSE;
}
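For completeness, here is roughly how that $sftp handle might be set up, and where TRUE/FALSE come from. This is only a sketch assuming Net::SFTP::Foreign (whose remove/get/put method names match the calls here); the host and credentials are placeholders.

use strict;
use warnings;
use Net::SFTP::Foreign;
use constant { TRUE => 1, FALSE => 0 };   # used by the subs in this post

# Hypothetical connection details -- substitute your own.
my $sftp = Net::SFTP::Foreign->new(
    'sftp.example.com',
    user     => 'someuser',
    password => 'secret',
    timeout  => 60,
);
die "SFTP connection failed: " . $sftp->error if $sftp->error;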
Actually, I tend to pass a hash of configuration details (hostname, username, retries, sleep time, etc.) and do this:
sub removeRemoteFile {
    my %args = @_;
    for (1 .. $args{RETRIES}) {
        return TRUE if $sftp->remove($args{FILE});
        sleep $args{SLEEP};
    }
    return FALSE;
}
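Removing a file then looks something like this (the path and retry values below are just placeholders):

my %cfg = (
    FILE    => '/outgoing/report.csv',   # hypothetical remote path
    RETRIES => 5,
    SLEEP   => 30,
);
warn "Couldn't remove $cfg{FILE}!\n" unless removeRemoteFile(%cfg);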
I do a similar sleep-and-retry for any get or put, but I also compare the remote and local file sizes:
sub getFile {
    my %args = @_;

    # Local name: the remote path with any leading directories stripped.
    my $name = $args{FILE};
    $name =~ s|.*/||;

    for (1 .. $args{RETRIES}) {
        my $size;
        if ($sftp->get($args{FILE}, "$args{LOCAL}/$name")) {
            $size = -s "$args{LOCAL}/$name";
        }

        # Success only if the download completed and the local size
        # matches the expected remote size.
        return TRUE if defined $size && $size == $args{RSIZE};
        sleep $args{SLEEP};
    }

    # Don't leave a partial download behind.
    unlink "$args{LOCAL}/$name" if -f "$args{LOCAL}/$name";
    return FALSE;
}
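The expected remote size (RSIZE) has to come from somewhere; one way, again assuming Net::SFTP::Foreign, is to stat the remote file before downloading. A sketch with made-up paths:

my $remote = '/outgoing/report.csv';
my $attrs  = $sftp->stat($remote)
    or die "Can't stat $remote: " . $sftp->error;

my $ok = getFile(
    FILE    => $remote,
    LOCAL   => '/tmp/incoming',
    RSIZE   => $attrs->size,   # expected size for the post-download check
    RETRIES => 5,
    SLEEP   => 30,
);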
I only remove a remote file IF the respective put or get succeeded. Something like:
for my $file ( ... ) {
    if (getFile( ... )) {
        if (!removeRemoteFile( ... )) {
            warn "Couldn't remove remote $file!\n";
        }
    }
    else {
        warn "Couldn't download $file!\n";
    }
}
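Putting it all together, a driver that lists a remote directory, downloads each file, and removes it only after a verified download could look like the sketch below. The directory names and retry values are made up, and the ls call is again Net::SFTP::Foreign; treat it as an illustration, not a drop-in script.

my $remoteDir = '/outgoing';      # hypothetical
my $localDir  = '/tmp/incoming';  # hypothetical

my $entries = $sftp->ls($remoteDir)
    or die "Can't list $remoteDir: " . $sftp->error;

for my $entry (@$entries) {
    my $file = $entry->{filename};
    next if $file =~ /^\.\.?$/;          # skip . and ..

    my %common = (RETRIES => 5, SLEEP => 30);

    if (getFile(%common,
                FILE  => "$remoteDir/$file",
                LOCAL => $localDir,
                RSIZE => $entry->{a}->size)) {
        warn "Couldn't remove remote $file!\n"
            unless removeRemoteFile(%common, FILE => "$remoteDir/$file");
    }
    else {
        warn "Couldn't download $remoteDir/$file!\n";
    }
}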