in reply to About Filehandles

It all depends on whether the file is so big that it doesn't fit into your computer's memory (then I have a slow and ugly solution) or whether it does fit (fast to write and neat):

Warning! This code is not verified and is only a concept.

my @stock;
open FILE, "stockfile";
while (<FILE>) { push @stock, $_; }    # read the whole file into memory
close FILE;
while (@stock) {
    foreach $line (@stock) {
        @items = split /\|/, $line;            # escape the pipe -- a bare '|' is a regex metacharacter
        open $items[2], ">>some_filename";     # filehandle named after the third field
        # put the items into it
        close $items[2];
        # delete that line from @stock
    }
}
Ciao

Re: Re: About Filehandles
by Sifmole (Chaplain) on Jun 25, 2001 at 16:23 UTC
    while(<FILE>) { push @stock,$_; } # Would be better as... @stock = <FILE>;
    Why wrap a while around a foreach here?
    while(@stock) { foreach $line (@stock) {
    A straight foreach will do just as well. Of course in that case there is no need for @stock at all...
    while (my $line = <FILE>) {
        # do stuff
    }
    Additionally this will not be memory intensive because it will only store one line in memory at a time. If you are attempting to keep only one filehandle open at a time via...
    open $item[2], ">>some_filename"; put_items_into_it; close $item[2]
    You might be better off checking whether the next item to be written out goes into the same file that is already open. There is no reason to close it and immediately open it again.
    my $cat = undef;
    while (my $line = <FILE>) {
        my @items = split /\|/, $line;              # escape the pipe -- a bare '|' is a regex metacharacter
        if (!defined $cat or $cat ne $items[2]) {
            close OUTFILE if defined $cat;
            open OUTFILE, ">>some_filename";
            $cat = $items[2];
        }
        print OUTFILE $line;                        ## Whatever ###
    }
    close OUTFILE;
    delete_that_line_from_stock;
    is not necessary, since there is no @stock anymore.
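    Putting the suggestions together, an untested sketch of the whole loop might look like the following. The mapping from $items[2] to an output filename ("$items[2].txt") is only an assumption here; use whatever naming your data actually calls for. If the stock file is not sorted by that third field the code still works, it just reopens the same output file more often than necessary.
    my $cat;
    open FILE, "stockfile" or die "Cannot open stockfile: $!";
    while (my $line = <FILE>) {
        chomp $line;
        my @items = split /\|/, $line;                 # literal pipe, so it must be escaped
        if (!defined $cat or $cat ne $items[2]) {      # only switch files when the category changes
            close OUTFILE if defined $cat;
            open OUTFILE, ">>$items[2].txt" or die "Cannot append to $items[2].txt: $!";   # assumed per-category filename
            $cat = $items[2];
        }
        print OUTFILE "$line\n";
    }
    close OUTFILE if defined $cat;
    close FILE;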