in reply to Loading a part of the file to array using Tie::File
The problem with memory is that you're reading all the lines of the file at once. Rather than doing that, read the file line by line and do your processing. That way you can handle a file of any size. You can do so like this:
# The three-argument version of open, and using lexical variables, is preferred
open my $LOG_READ, '<', $InLogFilePath or die "Can't open file $InLogFilePath: $!";

# Read the next line into $current_line
while (my $current_line = <$LOG_READ>) {
    # Skip lines that come before line $InStartLineNumber
    next if $. < $InStartLineNumber;

    # Process current line
    ...
}
close $LOG_READ;
As you'll notice, I intentionally changed the way you did things:

- There's no real need to spawn two additional processes to perform tasks that perl can already do for you (see the sketch after this list). Especially not to:
- The three-argument form of open is safer than the old two-argument open.
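For example, if those two processes were doing something like counting lines or pulling a range of lines out of the log (a guess, since the original commands aren't quoted here), both jobs are easy in pure Perl. This is only a sketch; $InEndLineNumber is a hypothetical counterpart to $InStartLineNumber, used here for illustration:

# Count the lines in the file without shelling out to wc or similar
open my $COUNT, '<', $InLogFilePath or die "Can't open file $InLogFilePath: $!";
my $line_count = 0;
$line_count++ while <$COUNT>;
close $COUNT;

# Collect only the lines from $InStartLineNumber to $InEndLineNumber
open my $RANGE, '<', $InLogFilePath or die "Can't open file $InLogFilePath: $!";
my @wanted;
while (my $line = <$RANGE>) {
    next if $. < $InStartLineNumber;   # before the range: skip
    last if $. > $InEndLineNumber;     # past the range: stop reading
    push @wanted, $line;
}
close $RANGE;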
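To show why the three-argument form matters (a generic illustration, not taken from your code): with the two-argument form, the filename itself can supply the open mode, so a stray character can turn a read into a destructive write.

my $file = '>important.log';   # a filename that happens to start with '>'

# Two-argument open: the leading '>' is taken as the mode,
# so this would truncate important.log instead of reading it.
# open my $BAD, $file or die "Can't open $file: $!";

# Three-argument open: the mode is explicit and the filename is
# taken literally, so nothing gets clobbered by accident.
open my $GOOD, '<', $file or die "Can't open file $file: $!";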
...roboticus
When your only tool is a hammer, all problems look like your thumb.