I have a requirement to transfer some large (20GB) media files from desktop PCs to a remote FTP server. I need to authenticate the users using client certificates. Neither SFTP nor FTP over SSH is an option. So what I would like to do is something like this:
A Perl process runs inside Apache as CGI or mod_perl. Certificate auth is handled by Apache itself before the Perl code is started.
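For reference, the client-certificate part on the Apache side would presumably look something like this (the paths and script location are placeholders, not from my actual setup):

    SSLEngine on
    SSLCertificateFile    /etc/apache2/ssl/server.crt
    SSLCertificateKeyFile /etc/apache2/ssl/server.key
    SSLCACertificateFile  /etc/apache2/ssl/client-ca.crt

    # Demand a client certificate signed by our CA before the CGI runs
    <Location /cgi-bin/ftp-relay>
        SSLVerifyClient require
        SSLVerifyDepth  1
    </Location>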
The Perl process:
Opens an FTP connection to the remote host
Reads n bytes of base64-encoded data from the client HTTP connection (I don't want these files buffered to disk because they are so large)
decode_base64() the data (from MIME::Base64), write it to the FTP connection.
Rinse and repeat until the entire file is transferred, then terminate the HTTP connection and the FTP connection (a rough sketch of this loop follows below).
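To make the idea concrete, here is a minimal, untested sketch of that relay loop as a plain CGI script. The host, credentials, remote file name, and chunk size are placeholders, and it assumes the client sends raw base64 with no line breaks, so chunks sized as a multiple of 4 decode cleanly:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use MIME::Base64 qw(decode_base64);
    use Net::FTP;

    # Placeholder host, credentials and target name.
    my $ftp = Net::FTP->new('ftp.example.com', Passive => 1)
        or die "FTP connect failed: $@";
    $ftp->login('user', 'password') or die "login failed: " . $ftp->message;
    $ftp->binary;

    # stor() opens the data connection and returns a Net::FTP::dataconn
    # object that can be written to incrementally -- nothing touches disk.
    my $conn = $ftp->stor('upload.bin') or die "STOR failed: " . $ftp->message;

    binmode STDIN;
    my $chunk     = 64 * 1024;                  # multiple of 4 = whole base64 groups
    my $remaining = $ENV{CONTENT_LENGTH} || 0;  # CGI tells us the body size
    while ($remaining > 0) {
        my $want = $remaining < $chunk ? $remaining : $chunk;
        my $n = read(STDIN, my $buf, $want) or last;
        $remaining -= $n;
        # NB: a short read() can split a 4-byte base64 group; a robust
        # version would buffer the remainder rather than decode blindly.
        my $data = decode_base64($buf);
        $conn->write($data, length $data) or die "FTP data write failed";
    }
    $conn->close;
    $ftp->quit;

    print "Content-Type: text/plain\r\n\r\nOK\n";

Under mod_perl I imagine the same loop works with $r->read() in place of reading STDIN.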
So my questions:
Is this possible (I'm guessing yes):
Can you receive HTTP file transfers without them being saved to disk?
Can you upload data over FTP without it coming from disk?
What would be the best modules for doing this?
What are the scalability issues with something like this? I expect a maximum of maybe 50 concurrent uploads.
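For what it's worth, on the desktop side I picture the file being stream-encoded rather than slurped into memory. A hypothetical LWP client (URL and certificate paths invented) could use a code-ref body, which LWP sends with chunked transfer encoding:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTTP::Request;
    use MIME::Base64 qw(encode_base64);

    open my $fh, '<:raw', $ARGV[0] or die "open: $!";

    my $req = HTTP::Request->new(POST => 'https://upload.example.com/cgi-bin/ftp-relay');
    # A code-ref body makes LWP stream the request chunked, pulling the
    # file through memory one block at a time.
    $req->content(sub {
        my $n = read($fh, my $buf, 57 * 1024);  # 57 raw bytes -> 76 base64 chars
        return '' unless $n;                    # empty string ends the body
        return encode_base64($buf, '');         # '' eol: no line breaks
    });

    my $ua = LWP::UserAgent->new(ssl_opts => {
        SSL_cert_file => 'client.crt',          # placeholder client certificate
        SSL_key_file  => 'client.key',
    });
    my $res = $ua->request($req);
    die $res->status_line unless $res->is_success;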