Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

This node falls below the community's threshold of quality. You may see it by logging in.

Re: download new files from remote server
by thabenksta (Pilgrim) on Jul 02, 2001 at 22:22 UTC

    This depends on many things. First of all, what does this question have to do with Perl? Are you wanting to write a Perl script that does this? Second, what are your means of transfer? FTP? SCP?

    I'm afraid you are going to have to be much more specific before we can answer your question.

    -thabenksta
    my $name = 'Ben Kittrell'; $name=~s/^(.+)\s(.).+$/\L$1$2/g; my $nick = 'tha' . $name . 'sta';
      Yes, I have to write a Perl script to do so, which I am not adept at. I am familiar with FTP and SCP and can type that stuff by hand on the command line, but I'd like to just run a Perl script that does all that:

      1. Access server X.
      2. Locate new files within X:/
      3. Copy/ftp/download new file(s) to my dir.
      4. Run Java program on file.
        It all depends on how you access the other machine. The simplest would be an NFS mount and the use of find with the -exec option.
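
        A minimal sketch of that idea in Perl (mirroring find with -exec), assuming the remote directory is NFS-mounted at /mnt/remote, that "new" means modified since a marker file was last touched, and that the Java program is started as "java SomeProgram file" -- the mount point, marker file, and class name are all placeholders:

            #!/usr/bin/perl
            use strict;
            use warnings;
            use File::Find;

            my $mount  = '/mnt/remote';           # assumed NFS mount point
            my $marker = "$ENV{HOME}/.lastrun";   # assumed marker file from the previous run
            my $last   = -e $marker ? (stat $marker)[9] : 0;

            find(sub {
                return unless -f $_;                  # plain files only
                return unless (stat $_)[9] > $last;   # modified since the last run
                if (system('cp', $File::Find::name, "$ENV{HOME}/incoming") != 0) {
                    warn "copy failed: $File::Find::name\n";
                    return;
                }
                system('java', 'SomeProgram', "$ENV{HOME}/incoming/$_");  # assumed Java invocation
            }, $mount);

            # touch the marker so the next run only picks up newer files
            open my $fh, '>', $marker or die "Cannot update $marker: $!";
            close $fh;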

        Another simple way would be to use rsync, which produces a list of the files it downloads. You can read the output of rsync through a pipe and then run the Java program on each file.
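
        A rough sketch of the rsync approach, assuming the files live under remotehost:/data/, land in /home/me/incoming/, and the Java program is run as "java Process file" -- the host, paths, and class name are placeholders:

            #!/usr/bin/perl
            use strict;
            use warnings;

            my $dest = '/home/me/incoming';

            # -a archive mode, -v prints the name of every file it transfers
            open my $rsync, '-|', "rsync -av remotehost:/data/ $dest/"
                or die "Cannot run rsync: $!";

            while (my $line = <$rsync>) {
                chomp $line;
                next if $line =~ m{/$};          # skip directory entries
                next unless -f "$dest/$line";    # ignore rsync's header and summary lines
                system('java', 'Process', "$dest/$line");  # assumed Java invocation
            }
            close $rsync or warn "rsync exited with status $?\n";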

        FTP would be harder. You'd have to browse the directory tree yourself, checking all the timestamps and downloading the files you are interested in, on which you can then run the Java program.
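
        For the FTP route, a sketch with Net::FTP that pulls everything modified in the last day from a single directory; the host, login, directory, cutoff, and Java invocation are all assumptions, and a real script would also have to recurse into subdirectories:

            #!/usr/bin/perl
            use strict;
            use warnings;
            use Net::FTP;

            my $ftp = Net::FTP->new('remotehost') or die "Cannot connect: $@";
            $ftp->login('user', 'password')       or die 'Login failed: ', $ftp->message;
            $ftp->cwd('/outgoing')                or die 'cwd failed: ',   $ftp->message;
            $ftp->binary;

            my $cutoff = time() - 24 * 60 * 60;   # assumed definition of "new": the last 24 hours

            for my $file ($ftp->ls) {
                my $mtime = $ftp->mdtm($file) or next;  # no timestamp (e.g. a directory): skip
                next unless $mtime > $cutoff;
                $ftp->get($file) or warn 'get failed: ', $ftp->message;
                system('java', 'Process', $file);       # assumed Java invocation
            }
            $ftp->quit;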

        HTTP would be even harder, because it's often not possible to browse or to get timestamps.

        UUCP could be an option as well. Just execute remote commands, wait for the results to come back, decide what you want to have, send more remote commands (to pack a file), fetch it, unpack it, and run the Java program.

        cpio could of course be an option as well. Or mail. Or samba. Or sneakernet.

        -- Abigail