OK, hard question; it's probably not possible, but one never knows...
I have a file on Unix with lines of varying length, and I'm reading it one line at a time, kinda like tail -f.
I'd like to skip a line if it's longer than a given $MAX length. The problem comes when that line is hundreds of megabytes long: the process dies of memory exhaustion (it gets killed by the kernel).
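For reference, the naive loop I'm starting from looks roughly like this (simplified; the file name, $MAX, and the per-line work are placeholders):

    use strict;
    use warnings;

    my $MAX = 4096;     # placeholder cutoff
    open my $fh, '<', 'huge.log' or die "open: $!";   # placeholder file

    # readline() materializes the whole line before I can test it,
    # so the length check comes too late for a monster line.
    while (defined(my $line = <$fh>)) {
        next if length($line) > $MAX;
        # ... the real per-line work goes here ...
    }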
Is there a way for me to skip it without having to seek for the end of the line? Like a do_nothing_until_new_line_enters()?
I've tried playing with read(), readline(), and the like, but all of them fail... I haven't tried rewriting the thing to use getc(), but the problem would still be there, wouldn't it? How do I know I've reached a new line in the file after that monstrous mega-line?
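The closest I can imagine is reading fixed-size chunks myself and throwing them away until a newline shows up, then seeking back to just past it. A rough, untested sketch of the idea ($MAX, $CHUNK, and the file name are all placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(SEEK_CUR);

    my $MAX   = 4096;       # placeholder: longest line I'm willing to keep
    my $CHUNK = 8192;       # placeholder: read buffer size

    open my $fh, '<', 'huge.log' or die "open: $!";   # placeholder file

    until (eof $fh) {
        my $line = '';
        my $skip = 0;

        while (read $fh, my $buf, $CHUNK) {
            my $nl = index $buf, "\n";
            if ($nl < 0) {
                # No newline in this chunk; keep the bytes only while
                # the line is still short enough to matter.
                $line .= $buf unless $skip;
                if (!$skip and length($line) > $MAX) {
                    $skip = 1;
                    $line = '';        # drop what was buffered so far
                }
                next;
            }
            # Newline found: keep the head of the chunk, then seek
            # back so the next read starts just past the newline.
            $line .= substr($buf, 0, $nl) unless $skip;
            seek $fh, $nl + 1 - length($buf), SEEK_CUR or die "seek: $!";
            last;
        }

        # The final partial chunk can still push a kept line past $MAX.
        next if $skip or length($line) > $MAX;
        print "kept a ", length($line), "-byte line\n";   # stand-in for real work
    }

If I got it right, memory stays bounded by $MAX plus one chunk, but it feels like reinventing buffering by hand.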
Thanks for your help...
--
our $Perl6 is Fantastic;