in reply to Processing large files
G'day Dr Manhattan,
This would be an ideal situation for the core module Tie::File. It reads records on demand rather than loading the whole file into memory, so the size of the input file isn't a problem, and it lets you eliminate the while loop, chomp, push, if condition and $counter. Also, you don't appear to be storing data in @information for any subsequent use, so you can drop that variable and the for loop that processes it. Here's roughly what you'd need:
use strict;
use warnings;
use autodie;
use Tie::File;

# Present the input file as an array of lines without slurping it into memory
tie my @input_data, 'Tie::File', 'your_input_filename';

open my $output_fh, '>', 'your_output_filename';

for my $record (@input_data) {
    my $extracted_info = ...;    # extract info from $record here
    print $output_fh "$extracted_info\n";
}

untie @input_data;
close $output_fh;
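One extra note in case the file is truly enormous: Tie::File only keeps a bounded read cache, and, if I remember the documentation correctly, you can tune the size of that cache with the memory option (in bytes) when you tie the array. The figure here is just an arbitrary example, not a recommendation:

tie my @input_data, 'Tie::File', 'your_input_filename', memory => 20_000_000;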
-- Ken