delirium has asked for the wisdom of the Perl Monks concerning the following question:

I am in the process of converting a large set of FTP shell scripts into a unified Perl system. The final product will use the Net::FTP module, and will keep login credentials, paths, and filenames in a separate config file (chmod 0600 or secured some other way). It will also call optional pre- and post-processes, and keep a simple tied hash file (NDBM_File or some such) of previously downloaded filenames, sizes, and timestamps.
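Something along these lines is what I have in mind. The config path, key names, and key=value format below are placeholders of my own, not a settled layout:

    use strict;
    use warnings;
    use Fcntl;          # for O_RDWR, O_CREAT
    use NDBM_File;

    # Read a simple key=value config file; the path and format
    # are placeholders, not a prescribed layout.
    my %cfg;
    open my $fh, '<', '/etc/ftpsync.conf' or die "config: $!";
    while (<$fh>) {
        chomp;
        next if /^\s*(?:#|$)/;                  # skip comments and blanks
        my ($key, $val) = /^(\w+)\s*=\s*(.*)$/ or next;
        $cfg{$key} = $val;
    }
    close $fh;

    # Tie the download history: filename => "size,mtime"
    tie my %seen, 'NDBM_File', $cfg{history_db}, O_RDWR | O_CREAT, 0600
        or die "Cannot tie download history: $!";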

The key ingredients that let me sell this as a Perl project were the ability to use Net::FTP to get a directory listing and branch on the results before QUIT-ing the FTP session (sketched below), Perl's native text processing for compliance-checking data, and a simple database file for catching duplicate data. All of those features have to stay in the final product, which makes me wonder how many additional "cool" features I'll find myself unable to live without.
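For the curious, here is roughly what that list-then-branch loop looks like; the host, directory, and filename pattern are placeholders:

    use strict;
    use warnings;
    use Net::FTP;

    my %seen;    # stands in for the tied NDBM_File history above

    my $ftp = Net::FTP->new('ftp.example.com', Timeout => 30)
        or die "Cannot connect: $@";
    $ftp->login('user', 'secret') or die 'Login failed: ', $ftp->message;
    $ftp->cwd('/outgoing')        or die 'cwd failed: ',   $ftp->message;

    # Fetch the listing and branch on it while the session is open.
    for my $file ($ftp->ls) {
        next unless $file =~ /\.csv\z/;   # compliance check (placeholder)
        next if exists $seen{$file};      # skip known duplicates
        $ftp->get($file) or warn "get $file failed: ", $ftp->message;
        $seen{$file} = join ',', ($ftp->size($file) || 0), time;
    }
    $ftp->quit;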

I'm interested in seeing, before I go too far in a direction I'll regret later, how other coders have approached similar projects. Did the end product work out well? Was there a critical module that was a must-have? Did it end up a bloated nightmare that everyone hated? Did Master Foo enlighten you with a simple time-saving process that saved the day?

Thanks.

Re: Perl-ifying bloated ftp scripts
by princepawn (Parson) on Oct 27, 2003 at 16:37 UTC
    The final product will use the Net::FTP module, and will keep login credentials, paths, and filenames in a separate config file
    • you might also look at my Net::FTP::Common
    • login credentials can be kept in a .netrc file, resulting in automated login via Net::FTP(::Common)? (see the sketch after this list)
    • paths and filenames are a snap with Common. Use whatever config file solution you like
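    For illustration (the hostname and credentials are placeholders): with a ~/.netrc entry like

        machine ftp.example.com
        login myuser
        password s3cret

    kept chmod 0600 so Net::Netrc will accept it, the script needs no credentials at all:

        use strict;
        use warnings;
        use Net::FTP;

        my $ftp = Net::FTP->new('ftp.example.com')
            or die "Cannot connect: $@";
        # With no arguments, login() looks the credentials up
        # via Net::Netrc.
        $ftp->login or die 'Login failed: ', $ftp->message;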
