There are many ways, ranging from easy to hard.
- Use rsync - it makes keeping two directories in sync very easy and doesn't waste bandwidth. It works painlessly over rsh and ssh, and there are clients for both Win32 and "Unix". Perl is not necessarily involved, but there is also Net::RsyncP. (A sketch of driving rsync from Perl follows this list.)
- Use scp to transfer the files. Very easy, and Perl is not involved, but there are also Net::SSH and Net::SCP (sketched below).
- Use Net::FTP to transfer your files via ftp. This is less secure because the login and password are transferred in clear text over the network, but it may be a solution (see the sketch below).
- Use LWP and serve your files on the Unix box via a web server. This restricts access even less than ftp does (sketch below).
- Write your own server and client in Perl to transfer the files. This is the ugliest and most overkill solution, but if you really are in need, a simple netcat-style server and client can pipe a .tar archive over a socket connection easily, and each fits in about 10 lines of Perl.
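For the rsync route, here is a minimal sketch of driving it from Perl via system(); the paths, user and host names are placeholders:

#!/usr/bin/perl -w
use strict;

# Mirror a local directory to a remote host over ssh.
# -a preserves permissions and times, -z compresses on the wire;
# rsync itself only sends the parts that changed.
my @cmd = ('rsync', '-az', '-e', 'ssh',
           '/local/dir/', 'user@remotehost:/remote/dir/');
system(@cmd) == 0 or die "rsync failed: $?";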
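If you prefer the scp route from within Perl, Net::SCP's interface looks roughly like this, if memory serves (host, user and path are made up):

#!/usr/bin/perl -w
use strict;
use Net::SCP;

# Fetch one file from the remote box over scp.
my $scp = Net::SCP->new('remotehost', 'username');
$scp->get('/remote/path/file') or die $scp->{errstr};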
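The Net::FTP variant is straightforward; host, credentials and filename below are placeholders:

#!/usr/bin/perl -w
use strict;
use Net::FTP;

# Fetch a file via ftp.
my $ftp = Net::FTP->new('remotehost') or die "Can't connect: $@";
$ftp->login('username', 'password') or die "Login failed: ", $ftp->message;
$ftp->binary;                  # don't mangle non-text files
$ftp->get('file.tar') or die "Get failed: ", $ftp->message;
$ftp->quit;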
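And the LWP variant, assuming the Unix box serves the files at some URL (the URL here is an example):

#!/usr/bin/perl -w
use strict;
use LWP::Simple;

# Pull a file over http and store it locally.
my $rc = getstore('http://remotehost/files/file.tar', 'file.tar');
die "Download failed: $rc" unless is_success($rc);

LWP::Simple::mirror() is worth a look too: it only refetches when the remote copy is newer, which makes repeated syncs cheaper.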
I would go and research the possibilities in the order I presented them, but if you're really desperate, look at my netcat-style file pipes (sender and receiver) below:
The sender (called nc for hysterical raisins):
#!/usr/bin/perl -w
use strict;
use IO::Socket;

# connect to the host and port given on the command line and make
# the socket the default output handle
select(IO::Socket::INET->new(PeerAddr => shift, PeerPort => shift)
    or die "Couldn't connect: $!");

# copy STDIN (or any files left on @ARGV) to the socket
print for <>;
The receiver (called cn for reasons obvious):
#!/usr/bin/perl -w
use strict;
use IO::Socket;

# reverse netcat: listen, accept a single connection, and copy
# everything the peer sends to STDOUT
my $s = IO::Socket::INET->new(LocalAddr => 'sfsifc53',
                              LocalPort => 6666,
                              Listen    => 1)
    or die "Couldn't listen: $!";
my $c = $s->accept;
print while <$c>;
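For example, to pipe a tarball across (assuming the receiver runs on sfsifc53, where it listens on port 6666):

# on the receiving box (sfsifc53):
./cn | tar xf -

# on the sending box:
tar cf - somedir | ./nc sfsifc53 6666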
Most likely, the code above could be vastly simplified by using IO::All, but I haven't used that module yet.