in reply to search/replace very large file w/o linebreaks
One thing you could try is to replace the string Perl uses to identify the ends of lines (the input record separator, $/) with one of the tags in your file.
    {
        local $/ = '<tag>';
        while (<>) {
            # do stuff
        }
    }
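For the original problem (a global search/replace on a huge file with no linebreaks), a rough sketch along those lines might look like this; the tag, the file names, and the s/foo/bar/ substitution are only placeholders:

    {
        local $/ = '</record>';   # hypothetical tag that ends each record
        open my $in,  '<', 'big.xml'     or die $!;
        open my $out, '>', 'big.xml.new' or die $!;
        while (my $chunk = <$in>) {
            $chunk =~ s/foo/bar/g;   # your real search/replace goes here
            print {$out} $chunk;
        }
        close $out or die $!;
        close $in;
    }

Each chunk ends at a tag boundary, so a substitution that never needs to cross a tag is safe to do chunk by chunk.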
Or you could just feed a while loop chunks of data at a time using a trick described in perldoc perlvar:
Setting $/ to a reference to an integer, scalar containing an integer, or scalar that's convertible to an integer will attempt to read records instead of lines, with the maximum record size being the referenced integer. So this:

    local $/ = \32768; # or \"32768", or \$var_containing_32768
    open my $fh, $myfile or die $!;
    local $_ = <$fh>;

will read a record of no more than 32768 bytes from FILE. If you're not reading from a record-oriented file (or your OS doesn't have record-oriented files), then you'll likely get a full chunk of data with every read. If a record is larger than the record size you've set, you'll get the record back in pieces.
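Turned into a loop, the fixed-size-record approach could look something like the sketch below; 'big.dat' and the s/foo/bar/ substitution are placeholders. One caveat: a match that straddles a chunk boundary will be missed by this simple loop, so it only works if your pattern is guaranteed to fit inside a single chunk.

    open my $fh, '<', 'big.dat' or die $!;
    {
        local $/ = \32768;            # read at most 32768 bytes per <>
        while (my $chunk = <$fh>) {
            $chunk =~ s/foo/bar/g;    # substitution within this chunk only
            print $chunk;             # write the modified chunk to STDOUT
        }
    }
    close $fh;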
Update: Commented out irrelevant text from perlvar.
--
Allolex
Replies are listed 'Best First'.
Re: Re: search/replace very large file w/o linebreaks
  by ysth (Canon) on Jan 08, 2004 at 17:39 UTC
  by nothingmuch (Priest) on Jan 09, 2004 at 11:37 UTC
  by borisz (Canon) on Jan 09, 2004 at 11:48 UTC
  by nothingmuch (Priest) on Jan 09, 2004 at 11:51 UTC
  by ysth (Canon) on Jan 11, 2004 at 03:13 UTC
  by nothingmuch (Priest) on Jan 11, 2004 at 13:22 UTC