in reply to Safely reading line by line

I'd just impose a memory limit on the perl interpreter process, and have it die automatically if a line is too long.

Of course that's only an option if you don't mind losing some data from possibly manipulated sources, and provided you don't leave damaged data structures behind (on disk, that is).

Re^2: Safely reading line by line
by martin (Friar) on Jun 27, 2007 at 16:49 UTC
    A total memory limit on a process limits the impact a single failure can have on the rest of the system. That is a reasonable precaution.

    On my Debian GNU/Linux box I can call

    ulimit -v 10000
    in the shell before starting my program, and it will no longer be able to use more than 10000 kilobytes of virtual memory.

    However, that is not all I wanted. I would like to be able to stop processing the input file as soon as its contents are known to be malformed and take whatever evasive action is most appropriate. In many cases that rules out simply crashing.
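
    One way to get that behaviour is to avoid readline's unbounded buffering and enforce a per-line limit yourself. A minimal sketch (the filename and limit are made up; setting $/ to a scalar reference makes readline return fixed-size chunks instead of lines):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $limit = 65536;            # maximum bytes we accept per line (arbitrary)
open my $fh, '<', 'input.txt' or die "open: $!";   # hypothetical input file

local $/ = \($limit + 1);     # readline now returns chunks of at most this size
my $buf = '';
while (defined(my $chunk = <$fh>)) {
    $buf .= $chunk;
    # peel off every complete line currently in the buffer
    while ($buf =~ s/\A([^\n]*)\n//) {
        my $line = $1;
        # ... process $line here ...
    }
    # anything left is a partial line; bail out as soon as it is too long
    if (length($buf) > $limit) {
        warn "input malformed: line longer than $limit bytes\n";
        last;                 # take evasive action instead of crashing
    }
}
close $fh;
```

    Because each read brings in at most $limit + 1 bytes, an overlong line is detected after buffering no more than about twice the limit, and the program keeps control instead of being killed by the resource limit.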