in reply to Re: Searching large files a block at a time
in thread Searching large files a block at a time
What I needed was a way to use "while (<data>)" without the getline() approach, which was reading the data in one line at a time. My LDIF is over 15 million lines, so that was quite slow. Using your code, I get results in ~10 seconds, which is acceptable (though still a lot slower than the shell script that pipes into Perl, and I'm not sure why that is). Thanks to you and Roboticus for steering me in the right direction. Cheers, JW.

    my $z = IO::Uncompress::Bunzip2->new($filename);
    while (<$z>) { }
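For anyone following along: the reason block reads beat `<$z>` here is that IO::Uncompress::Bunzip2 also provides a read() method that fills a buffer of a requested size, so you can scan a large chunk at a time and only pay the per-line regex cost, not per-line I/O overhead. Below is a minimal, self-contained sketch of that idea; the sample data, the 64 KiB block size, and the `/^dn:/` pattern are illustrative assumptions, not from the thread (it builds a small in-memory .bz2 payload so it runs anywhere).

    use strict;
    use warnings;
    use IO::Compress::Bzip2    qw(bzip2 $Bzip2Error);
    use IO::Uncompress::Bunzip2 qw($Bunzip2Error);

    # Illustrative data: a tiny fake LDIF, compressed into a scalar.
    my $plain = join '', map { "dn: uid=user$_,dc=example,dc=com\n" } 1 .. 1000;
    bzip2 \$plain => \my $compressed
        or die "bzip2 failed: $Bzip2Error";

    my $z = IO::Uncompress::Bunzip2->new(\$compressed)
        or die "bunzip2 failed: $Bunzip2Error";

    my ($count, $tail, $block) = (0, '', '');
    # Read 64 KiB at a time instead of one line at a time.
    while ($z->read($block, 65536) > 0) {
        $block = $tail . $block;
        my $nl = rindex $block, "\n";
        if ($nl >= 0) {
            # Scan only complete lines; keep any partial line for the
            # next block so records split across a boundary still match.
            my $complete = substr $block, 0, $nl + 1;
            $tail  = substr $block, $nl + 1;
            $count += () = $complete =~ /^dn:/mg;
        }
        else {
            $tail = $block;
        }
    }
    $count++ if $tail =~ /^dn:/;    # final unterminated line, if any

    print "matched $count dn lines\n";

The boundary-carrying `$tail` is the part that `while (<$z>)` gives you for free, so whether the block approach is worth it depends on how much per-line work the line-oriented loop was doing.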
Re^3: Searching large files a block at a time
by kcott (Archbishop) on Aug 02, 2017 at 06:52 UTC
by JediWombat (Novice) on Aug 03, 2017 at 23:56 UTC
by marioroy (Prior) on Aug 04, 2017 at 04:41 UTC
by marioroy (Prior) on Aug 04, 2017 at 15:46 UTC
by marioroy (Prior) on Aug 05, 2017 at 00:23 UTC