in reply to Splitting long file
In the spirit of TIMTOWTDI, and because I have been having a blast with Tie::File lately, here is another solution.
This is a very small test set, I realize, but I wasn't ambitious enough to create a file by hand as large as the one you were talking about:

Tom data data data data data $
Dick data data data data data $
Harry data data data data data $
#!/usr/bin/perl -w
#############################
use strict;
use warnings;
use Tie::File;

my @ry     = ();  # This will be tied
my $OLDIFS = $/;  # save the input record separator
$/ = "\$\n";      # now change it to suit our needs

# Rope, tie, and brand 'em! YEEE-HAW!
tie @ry, "Tie::File", "datFile.txt" or die $!;

foreach my $rec (@ry) {        # iterate through the records
    chomp $rec;                # eliminate the record separator
    next unless $rec;          # eliminate spew about blank records,
                               # if they happen
    my $fname = ( split /\n/, $rec )[0];   # get the name
    next unless $fname;        # ho hum. Sometimes you're the
                               # windshield, sometimes you're
                               # the bug.
    open my $fout, '>', $fname or die "$fname: $!";  # open output file
    print $fout $rec;          # store the record
    close $fout;               # done with this one
}
untie @ry;                     # cut them thar ropes!
$/ = $OLDIFS;                  # restore the record separator
The potential issue with this solution, I've been told (and I can't verify this one way or the other), is that Tie::File is memory-greedy and will slurp the whole file into memory in one gulp. If the box you are running this script on is memory-starved and that assertion is true, you might have a problem. You mentioned memory issues in your OP, and I don't know how starved for memory your box is. For what it's worth, the Tie::File documentation describes a bounded read cache whose size you can cap with the memory option, so the whole-file slurp should be avoidable.
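If memory really is the worry, TIMTOWTDI again: a plain read loop with $/ set locally streams one record at a time and never involves Tie::File's cache at all. Here's a sketch under the same assumptions as above (a datFile.txt whose records end in "$\n", with the first line of each record being the output file name); the sample data from the post is written out first so the snippet runs on its own:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Recreate the small sample data set from the post
# (hypothetical layout: name line, data line, "$\n" terminator).
open my $fh, '>', 'datFile.txt' or die "datFile.txt: $!";
print $fh "Tom\ndata data data data data\n\$\n"
        . "Dick\ndata data data data data\n\$\n"
        . "Harry\ndata data data data data\n\$\n";
close $fh;

# Stream the file record by record; only one record is ever in memory.
open my $fin, '<', 'datFile.txt' or die "datFile.txt: $!";
{
    local $/ = "\$\n";                 # record separator, restored on block exit
    while ( my $rec = <$fin> ) {
        chomp $rec;                    # drop the "$\n" terminator
        next unless $rec =~ /\S/;      # skip blank records
        my ($fname) = split /\n/, $rec;    # first line is the file name
        next unless $fname;
        open my $fout, '>', $fname or die "$fname: $!";
        print $fout $rec;              # store the record
        close $fout;
    }
}
close $fin;
```

The local $/ inside the bare block means the separator change can't leak out and surprise other code, which is a bit tidier than the save-and-restore dance.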
You didn't specify whether you wanted to preserve the dollar-sign record separator, but if you do, eliminating the chomp will accomplish that as well.
HTH!