mldvx4 has asked for the wisdom of the Perl Monks concerning the following question:

Esteemed Monks,

I have the short script below for receiving files over HTTPS and saving them in a directory. It works great for files up to a few hundred kB. However, for files around 50 MB or larger, the script times out without saving the files.

I've tried increasing the timeout for the HTTP daemon (OpenBSD's httpd), but that shouldn't be needed: at these network speeds the upload should take only a second or so, just as it does over SFTP.
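
For reference, the httpd.conf knobs I've been experimenting with look roughly like this (the option names are from httpd.conf(5); the server name, certificate paths, and values here are illustrative, not my exact config):

server "upload.example.org" {
    listen on egress tls port 443
    tls certificate "/etc/ssl/upload.crt"
    tls key "/etc/ssl/private/upload.key"

    # request-body and timeout limits; the defaults are much lower
    # (max request body is 1 MB by default, if I remember right)
    connection max request body 536870912
    connection request timeout 120
    connection timeout 600

    location "/*" {
        fastcgi socket "/run/sockets/upload.sock"
    }
}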

#!/usr/bin/perl

use CGI::Fast;
use CGI::Carp qw(fatalsToBrowser);
use strict;
use warnings;

my $socket = '/var/www/run/sockets/upload.sock';

# max file upload size (500 MB)
$CGI::POST_MAX = 1024 * 1024 * 500;

$ENV{FCGI_SOCKET_PATH} = $socket;
$ENV{FCGI_SOCKET_PERM} = 0775;

while (my $q = CGI::Fast->new) {
    print qq(Content-Type: text/html; charset=utf-8\n\n);
    print qq(<!DOCTYPE html>\n);
    print qq(<html xmlns="http://www.w3.org/1999/xhtml">\n);

    my $head = &head_default;
    print qq(<head>\n$head\n</head>\n);

    my $body;
    if ( $q->param && $q->request_method() eq 'POST' ) {
        $body = &upload($q);
    } else {
        $body = &body_default;
    }
    print qq(<body>\n$body\n</body>\n);
    print qq(</html>\n);
}

exit(0);

sub head_default {
    my $css = &css;
    my $head = <<"EOH";
<title>Hello, World</title>
$css
EOH
    return($head);
}

sub css {
    my $css = <<EOC;
<style type="text/css" media="screen">
/*<![CDATA[*/
BODY {
    font-family: sans-serif;
    margin: 0;
}
H1 {
    font-size: 150%;
    font-weight: bold;
    font-family: serif;
    background-color: #a080ff;
    padding-left: 1em;
    padding-right: 1em;
    border-right: thin solid #000000;
    border-left: thin solid #000000;
}
P.update {
    clear: both;
    font-size: 60%;
    text-align: center;
    margin-left: 0.5em;
    margin-right: 0.5em;
}
FORM {
    border: thin solid #000;
}
FORM > LABEL:has(~ DETAILS[open]) {
    display: none;
}
/*]]>*/
</style>
EOC
    return($css);
}

sub body_default {
    my $body = <<EOB;
<h1>Upload a File</h1>
<p></p>
<form method="post" enctype="multipart/form-data">
Select a file to upload:
<input type="file" name="fileToUpload" id="fileToUpload">
<br />
&nbsp;
<br />
<input type="submit" value="Upload File" name="submit">
</form>
EOB
    return($body);
}

sub upload {
    my ($q) = (@_);

    if (!$q->param('fileToUpload')) {
        return(0);
    }

    my $file = $q->param('fileToUpload');
    $file =~ s/\s+/ /g;
    $file =~ s/[^\w\ \_\.\-]+//g;
    if (! $file) {
        return(0);
    }

    my $upload_dir = "/var/www/uploads";
    my $upload = $q->upload('fileToUpload');

    open ( my $ufile, ">", "$upload_dir/$file" ) or die "$!";
    binmode($ufile);
    while ( my $data = <$upload> ) {
        print $ufile $data;
    }
    close ($ufile);

    my $body = <<EOB;
<p>
The file <b>$file</b> was uploaded successfully.
<br />
Press the <b>back</b> button or close this tab.
</p>
EOB
    return($body);
}
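
One thing I've started to wonder about myself is the write loop in upload(): it reads the temporary upload handle line by line, and a large binary file may have few or no newlines. Below is the untested chunked-read variant I plan to try next; it is only a sketch and reuses $q, $upload_dir and $file from that sub.

my $upload = $q->upload('fileToUpload');
open( my $ufile, ">", "$upload_dir/$file" ) or die "$!";
binmode($ufile);
binmode($upload);

# read in fixed-size chunks instead of line by line
my $buf;
while ( read( $upload, $buf, 64 * 1024 ) ) {    # 64 kB per chunk
    print {$ufile} $buf;
}
close($ufile);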

Again, it seems to work fine with smaller files. What should I look at tweaking in order to upload larger files? Any other tips welcome, too.

Re: CGI::Fast timing out with uploads of large files not small ones
by Marshall (Canon) on Sep 17, 2025 at 00:29 UTC

      Thanks. That was informative, and I tried some experiments with different settings, but nothing changed with regard to the large file uploads timing out.

      I guess a workaround is to use SFTP, but I hope to eventually figure out a solution for the WWW upload form.

      Could the problem be the httpd daemon?
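
      In the meantime I've put together a small test client to time the upload from outside the browser, hoping to see whether the stall is in httpd or in the FastCGI script. This is just a sketch; the target URL and the test file path are placeholders:

      #!/usr/bin/perl
      # Throwaway upload timer. LWP::UserAgent and HTTP::Request::Common
      # come with libwww-perl; the URL and test file below are placeholders.
      use strict;
      use warnings;
      use Time::HiRes qw(time);
      use LWP::UserAgent;
      use HTTP::Request::Common qw(POST);

      my $ua    = LWP::UserAgent->new( timeout => 300 );
      my $start = time;
      my $res   = $ua->request(
          POST 'https://upload.example.org/',
          Content_Type => 'form-data',
          Content      => [
              submit       => 'Upload File',
              fileToUpload => ['/tmp/50mb-test.bin'],    # any large local file
          ],
      );
      printf "%s after %.1f seconds\n", $res->status_line, time - $start;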