Update: I got a negative vote for questioning the purpose of the OP's question. My bad. However, if the question is modified to reflect a more realistic scenario, there would be some interesting answers applicable in the real world.
I guess this is a Golf question?
I was working on a solution until I got to step 3 and realized that this requirement makes no sense.
It is so silly that I can't imagine a real-world use for it!
A more realistic scenario might be having to write a humongous amount of data to a multi-CD data set where no single file can span a CD boundary. I saw this sort of requirement back in the floppy-disk days: when loading, say, 20 diskettes in a data set, you want to keep going even if diskette #5 has a fatal error. At the end, say 19 diskettes loaded and one didn't. Now you can get a replacement for that one diskette and patch the system with it in a straightforward way.
I have no idea of a practical use for this requirement.
Here is where I stopped:
BTW, I see no need to actually parse the .csv files. My gosh, I am unaware of any CSV format that uses \n as a *field* delimiter - what that would even mean boggles my mind, and such a file would display very strangely in a text editor.
#!/usr/bin/perl
use strict;
use warnings;
# node: 11104399
use constant DIR => ""; # set as needed (include trailing slash for the glob below)
use constant N_FILES => 3;
# step 1: List all .csv files in a directory by increasing order of file size
my @files_by_size = sort { -s $a <=> -s $b } glob( DIR . '*.csv' );
print join( "\n", @files_by_size ), "\n";
# step 2: Drop the first line of each file and concat the rest into a single output file
open my $out, '>', "BigFile" or die "cannot open BigFile: $!";
foreach my $infile (@files_by_size)
{
    open my $fh, '<', $infile or die "unable to open $infile: $!";
    <$fh>;                       # throw away first (header) line of file
    print {$out} $_ while <$fh>; # $fh closes itself when it goes out of scope
}
close $out;
# step 3: Split the above output file into "n" smaller files
# without breaking up the lines in the input files
#
# This is a strange requirement! A VERY strange requirement!
# the obvious thing to do is to make n-1 "big" files and throw
# what is leftover into the nth file (which will be very small)
#
# The tricky part here is to make sure that at least one line
# winds up in the nth file. Now that I think about it...
#
# geez, if n==3 and total_bytes = size of the output file:
# create file1 and write lines to it until it holds >= total_bytes/2.
# write one line to file2.
# write the rest of the lines to file3.
my $big_files = N_FILES - 1;
# stopped at this point because this sounds like a Golf situation
# with a very contrived requirement, and I'm not good at that.
# step 4: this is easy
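For what it's worth, here is a sketch of how I'd attack step 3 if the requirement were taken at face value: aim for roughly equal-sized pieces, but only ever break at line boundaries, with the last piece absorbing whatever is left over. The part-file names and the demo input are made up for illustration - this is one guess at what the OP wants, not a definitive reading.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# make a small demo input - stands in for the concatenated "BigFile"
open my $demo, '>', 'BigFile' or die "cannot create BigFile: $!";
print $demo "line $_\n" for 1 .. 10;
close $demo;

my $n      = 3;                       # number of output pieces
my $total  = -s 'BigFile';            # total bytes to distribute
my $target = int( $total / $n ) + 1;  # approximate bytes per piece

open my $in, '<', 'BigFile' or die "cannot open BigFile: $!";
for my $i ( 1 .. $n ) {
    open my $out, '>', "part$i" or die "cannot open part$i: $!";
    my $written = 0;
    while ( defined( my $line = <$in> ) ) {
        print {$out} $line;
        $written += length $line;
        # pieces 1..n-1 stop once they reach the target size;
        # the last piece takes everything that remains
        last if $i < $n and $written >= $target;
    }
    close $out;
}
close $in;
```

As long as the input has at least n lines, each of the n pieces gets at least one line and no line is ever split across pieces.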