dnquark has asked for the wisdom of the Perl Monks concerning the following question:
My script stalls after about 2,200,000 lines when reading a large gz file. The memory usage of the script grows, but it appears to just be sitting there, stuck... Can anyone shed some light on what is going on, and hopefully suggest another way I can read large gz files directly from Perl?

```perl
use IO::Uncompress::Gunzip;

my $z = IO::Uncompress::Gunzip->new( $ARGV[0] );
while ( <$z> ) {
    chomp;
    print "$.:$_\n";
}
```
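Since the question asks for another way to read large gz files from Perl, here is a minimal sketch of two approaches commonly suggested for this kind of symptom: passing `MultiStream => 1` so IO::Uncompress::Gunzip keeps reading past the first gzip member (by default it stops at the first stream boundary, which matters for multi-member archives), and piping the file through the external `gzip`, which is usually much faster than decompressing in pure Perl. This assumes a Unix-like system with `gzip` on the PATH:

```perl
#!/usr/bin/perl
use strict;
use warnings;

use IO::Uncompress::Gunzip qw($GunzipError);

my $file = shift @ARGV or die "usage: $0 file.gz\n";

# --- Alternative 1: read all gzip members, not just the first ---
# By default IO::Uncompress::Gunzip stops at the first stream
# boundary; MultiStream => 1 makes it continue through the rest
# of a multi-member archive.
my $z = IO::Uncompress::Gunzip->new( $file, MultiStream => 1 )
    or die "gunzip failed: $GunzipError\n";
my $n = 0;    # explicit line counter for the object interface
while ( defined( my $line = $z->getline ) ) {
    chomp $line;
    printf "%d:%s\n", ++$n, $line;
}
$z->close;

# --- Alternative 2: pipe through the external gzip ---
# Decompression happens in a separate process, which is typically
# much faster, and $. works normally on the pipe filehandle.
open my $fh, '-|', 'gzip', '-dc', $file
    or die "cannot fork gzip: $!\n";
while ( my $line = <$fh> ) {
    chomp $line;
    print "$.:$line\n";
}
close $fh;
```

The list form of `open` with `-|` avoids the shell entirely, so the filename needs no quoting even if it contains spaces.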
Replies are listed 'Best First'.
Re: Problems reading large gz files with IO::Uncompress::Gunzip
by NetWallah (Canon) on Dec 30, 2008 at 02:12 UTC

Re: Problems reading large gz files with IO::Uncompress::Gunzip
by tilly (Archbishop) on Dec 30, 2008 at 03:23 UTC
    by dnquark (Novice) on Dec 30, 2008 at 04:12 UTC
    by NetWallah (Canon) on Dec 30, 2008 at 05:10 UTC
    by DStaal (Chaplain) on Dec 30, 2008 at 13:48 UTC