in reply to Quickest way to write multiple files

Update: if BrowserUK is correct, I can be safely ignored.

Opening and closing a file for each record is going to get you lousy performance. If the records are sorted by filename, try (untested):

    my $openfile = '';
    foreach my $file (@files) {
        my $filename = $file->{filename};
        my $content  = $file->{content};
        if ($filename ne $openfile) {
            $openfile = $filename;
            open(FILE, "> $filename") or die "Can't open $filename: $!";
        }
        print FILE $content;
    }
    close(FILE) if $openfile;
Otherwise, sort by filename (untested):
    for my $file (sort { $a->{filename} cmp $b->{filename} } @files) {
        ...
    }
You may get slightly better results by saving all the records for a file in an array and printing them together just before opening the next file (or before the final close).
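A minimal sketch of that accumulate-then-write approach, assuming @files is already sorted by filename and each element is a hashref with filename and content keys (the sample data here is made up for illustration):

```perl
use strict;
use warnings;

# Hypothetical sample records, pre-sorted by filename.
my @files = (
    { filename => 'a.txt', content => "line 1\n" },
    { filename => 'a.txt', content => "line 2\n" },
    { filename => 'b.txt', content => "line 3\n" },
);

my $openfile = '';
my @buffer;

# Write out the buffered records for the current file in one print.
sub flush_buffer {
    return unless $openfile;
    open my $fh, '>', $openfile or die "Can't open $openfile: $!";
    print $fh @buffer;    # a single print for the whole file
    close $fh;
    @buffer = ();
}

for my $file (@files) {
    if ($file->{filename} ne $openfile) {
        flush_buffer();                  # finish the previous file
        $openfile = $file->{filename};
    }
    push @buffer, $file->{content};      # accumulate, don't print yet
}
flush_buffer();                          # don't forget the last file
```

Each output file is opened once and written with one print, so the cost per record is just a push.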

If you had a smaller number of files (few enough to keep every handle open at once), I'd recommend a hash of filehandles instead:

    my %fh;
    foreach my $file (@files) {
        my $filename = $file->{filename};
        my $content  = $file->{content};
        if (!$fh{$filename}) {
            open $fh{$filename}, "> $filename"
                or die "Can't open $filename: $!";
        }
        print {$fh{$filename}} $content;
    }
    foreach my $filename (keys %fh) {
        close($fh{$filename});
    }

Re^2: Quickest way to write multiple files
by BrowserUk (Patriarch) on Jun 08, 2004 at 09:51 UTC

    As far as I can tell, his snippet was only opening and closing each file once, not once per line, and he appears to be writing the entire contents in one hit.

    With only 12 lines per small file, unless the lines are quite long, accumulating the data and writing it in one hit is unlikely to help: stdio has quite possibly buffered more than the entire size of each file anyway.


    Examine what is said, not who speaks.
    "Efficiency is intelligent laziness." -David Dunham
    "Think for yourself!" - Abigail