In reply to "Splitting Large File into many small ones"
If you are under any system with the Unix toolset available, the split command already does this and more. If you're bent on using Perl, the Perl Power Tools also have a pure Perl implementation of split.
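For instance, splitting a file into 1000-line chunks is a one-liner with the standard tool (the file name and prefix here are hypothetical):

```shell
# split my_file into 1000-line pieces named part_aa, part_ab, ...
split -l 1000 my_file part_
```
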
Personally, I would make the script more flexible by parsing the options via Getopt::Long and taking the input file as the last parameter. Having it last lets the script be used in a shell pipeline, e.g. gunzip -c my_file | perl -w partition_file.pl 1000, and it makes the remaining parameters independent of their position.
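A minimal sketch of that approach (the script structure, the --lines/--prefix option names, and the output naming scheme are my own choices, not from the original script): Getopt::Long consumes the options, and the diamond operator then reads either the trailing filename argument or STDIN, so the same script works standalone or in a pipeline.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Getopt::Long;

# Read lines from $in and write them out in chunks of $per_chunk lines,
# naming the output files "<prefix>_0000", "<prefix>_0001", ...
sub split_by_lines {
    my ($in, $per_chunk, $prefix) = @_;
    my ($chunk, $count, $out) = (0, 0, undef);
    while (my $line = <$in>) {
        if ($count % $per_chunk == 0) {
            close $out if $out;
            my $name = sprintf '%s_%04d', $prefix, $chunk++;
            open $out, '>', $name or die "Cannot open $name: $!";
        }
        print {$out} $line;
        $count++;
    }
    close $out if $out;
}

# Options first; whatever remains in @ARGV (or STDIN) is the input.
my $lines_per_chunk = 1000;     # hypothetical default chunk size
my $prefix          = 'part';   # hypothetical output-name prefix
GetOptions(
    'lines=i'  => \$lines_per_chunk,
    'prefix=s' => \$prefix,
) or die "Usage: $0 [--lines N] [--prefix NAME] [file]\n";

# The ARGV handle reads the leftover filename argument, or STDIN
# when none is given -- which is what makes pipeline use possible.
split_by_lines(\*ARGV, $lines_per_chunk, $prefix) unless caller;
```

Because the filename is positional only by convention of being last, adding further options later never disturbs existing invocations.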
perl -MHTTP::Daemon -MHTTP::Response -MLWP::Simple -e ' ; # The $d = new HTTP::Daemon and fork and getprint $d->url and exit;#spider ($c = $d->accept())->get_request(); $c->send_response( new #in the HTTP::Response(200,$_,$_,qq(Just another Perl hacker\n))); ' # web