nosbod has asked for the wisdom of the Perl Monks concerning the following question:

Dear Monks, I have users that upload single files via the browser using the standard 'browse' button, a text field, and a multipart form.
One of the little pests is lazy and doesn't like uploading each file individually. Granted, he does have 20-30 files to upload at a time.
As far as I know there's no way to upload a directory of files. I've no problem serially uploading them once they've been selected, but how can I make selecting the files easier?

Any thoughts?

Replies are listed 'Best First'.
Re: multiple file upload and CGI
by nobull (Friar) on Jan 25, 2005 at 12:31 UTC
    Without special client-side software, your simplest option is to allow your application to accept archive files such as ZIP or TAR files.

    OB-perl: Archive::Extract may be useful.
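
    A rough sketch of the server side with Archive::Extract, assuming the uploaded archive has already been saved to disk (the paths here are illustrative, not from the original post):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Archive::Extract;

        # Hypothetical location where the upload handler saved the archive
        my $archive = '/home/user/www/upload_dir/upload.zip';

        my $ae = Archive::Extract->new( archive => $archive );

        # extract() returns true on success; error() holds the failure reason
        $ae->extract( to => '/home/user/www/upload_dir/extracted' )
            or die "Extraction failed: " . $ae->error;

        print "Extracted: $_\n" for @{ $ae->files };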

      Yes, thanks, I think this may be the way to go.
Re: multiple file upload and CGI
by bradcathey (Prior) on Jan 25, 2005 at 12:37 UTC

    nosbod, I have struggled with this myself and believe the limitation is in HTML's enctype="multipart/form-data" handling. My only "solution" is to offer the user a number of <input type="file" ...> fields on the form and then loop through my upload function in my Perl code. I've seen several other sites do this as well, often asking for the number of files to upload first and then presenting that many file fields on a refreshed form.
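
    A minimal sketch of that multi-field approach, assuming CGI.pm and a form that repeats <input type="file" name="file"> several times (the field name and save directory are illustrative):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use CGI;
        use File::Basename qw(basename);

        my $q        = CGI->new;
        my $save_dir = '/home/user/www/upload_dir';   # hypothetical target directory

        # In list context, upload() returns one filehandle per submitted "file" field
        for my $fh ( $q->upload('file') ) {
            next unless defined $fh;

            # A CGI.pm upload handle stringifies to the client-supplied filename;
            # basename() strips any path (Unix-style paths only in this sketch)
            my $name = basename("$fh");

            open my $out, '>', "$save_dir/$name" or die "can't write $name: $!";
            binmode $out;
            my $buffer;
            while ( read( $fh, $buffer, 1024 ) ) {
                print {$out} $buffer;
            }
            close $out;
        }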

    OT: I also find it interesting that I can do nothing with this field on the client side, e.g. pre-server validation with JavaScript.


    —Brad
    "The important work of moving the world forward does not wait to be done by perfect men." George Eliot
      OT: I also find it interesting that I can do nothing with this field on the client side, e.g. pre-server validation with JavaScript.

      I too have been bitten by this, but believe me, it's a good idea. If this were not true, then any website you go to with Javascript turned on could have an invisible frame with a form and a file input button. Then a Javascript function could set the value to a known file location and submit it, and presto, some vital file from your hard drive has been whisked off to a server you don't know. I guess it would be less dangerous if the field were read-only from scripts, but even still...

        Even if it's read-only, it could potentially give the webpage at least knowledge of your file tree. And this is sometimes too much information.

        a Javascript function could set the value to a known file location and submit it, and presto, some vital file from your hard drive has been whisked off to a server you don't know.

        This hypothetical is no different from what theoretically could happen with the current DOM-integrated file upload form element. As it happens, secure modern JS implementations do the right thing and do not allow the behavior you describe. I am no big JS apologist, but I happen to be reading O'Reilly's "JavaScript: The Definitive Guide" and this specific issue was just covered.

        I would welcome adding directory-upload functionality to client-side JS.

Re: multiple file upload and CGI
by samizdat (Vicar) on Jan 25, 2005 at 13:10 UTC
    My HO is that having three linked forms with
    1. a file selector panel that copies the files to a temp directory,
    2. a tar-gzip-er, and then
    3. an upload-the-tgz-file button on a MIME multipart/form-data page

    is the easiest and most usable solution. Since even Doze unzip programs now understand tgz, you're not harming his productivity (or our Internet bandwidth) as much as individual uploads would.
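
    A rough sketch of step 3's server side, assuming the .tgz has already been saved to disk and that IO::Zlib is available so Archive::Tar can read gzipped tarballs directly (paths are illustrative):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Archive::Tar;

        # Hypothetical location where the upload handler saved the archive
        my $tgz = Archive::Tar->new('/home/user/www/upload_dir/upload.tgz')
            or die "Cannot read archive: " . Archive::Tar->error;

        # extract() unpacks relative to the current working directory
        chdir '/home/user/www/upload_dir/extracted' or die "chdir failed: $!";
        $tgz->extract() or die "Extraction failed: " . $tgz->error;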
Re: multiple file upload and CGI
by hellomoto (Novice) on Jan 25, 2005 at 22:42 UTC
    Hello! Check this out!
    (uploading a zipped file)

    HTML Form:
    <form action="upload.pl" method="post" enctype="multipart/form-data">
      <input type="file" name="file">
      <input type="submit" value="Upload">
    </form>
    Perl code ex:
    #!/usr/bin/perl
    use strict;
    use warnings;

    use CGI;
    use Archive::Zip qw( :ERROR_CODES :CONSTANTS );

    my $q        = CGI->new;
    my $save_dir = "/home/user/www/upload_dir";
    my $zip_path = "$save_dir/file.zip";

    # Get a filehandle for the uploaded file and save it to disk
    my $fh = $q->upload("file") or die "no file uploaded";
    open( my $out, '>', $zip_path ) or die "can't create file: $!";
    binmode $out;
    my $buffer;
    while ( read( $fh, $buffer, 1024 ) ) {
        print {$out} $buffer;
    }
    close $out;
    chmod 0666, $zip_path;

    # Read the archive back and extract each member
    my $zip = Archive::Zip->new();
    $zip->read($zip_path) == AZ_OK or die "can't read $zip_path";
    foreach my $member ( $zip->members() ) {
        my $name = $member->fileName;
        $zip->extractMember($name) == AZ_OK
            or die "extraction of $name failed";
    }
Re: multiple file upload and CGI
by superfrink (Curate) on Jan 25, 2005 at 19:13 UTC
    There are a few options (as others have pointed out):
    1. Use a Zip tool and upload one file.
    2. Use more than one FILE field but then you lose directory structure.
    3. Use a tool like Rad Upload (requires Java on the client and I think costs $20).
Re: multiple file upload and CGI
by nosbod (Scribe) on Jan 25, 2005 at 13:17 UTC
    Thanks, all.

    He's going to have to zip them up, I think.

Re: multiple file upload and CGI
by nothingmuch (Priest) on Feb 01, 2005 at 11:03 UTC
    The Gallery project has some solutions for this problem... they don't solve it outright, but work around it in various ways. The most convenient interface is actually a Java applet.

    -nuffin
    zz zZ Z Z #!perl