Re: Safely reading line by line

by moritz (Cardinal)
on Jun 27, 2007 at 10:51 UTC


in reply to Safely reading line by line

I'd just impose a memory limit on the perl interpreter process and let it die automatically if a line is too long.

Of course that's only an option if you don't mind losing some data from possibly manipulated sources, and if dying doesn't leave damaged data structures behind (on disk, that is).
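
A minimal sketch of that idea (mine, not from the node), assuming the CPAN module BSD::Resource is available; the 50 MB cap and the input file name are arbitrary. The limit makes perl abort with "Out of memory!" as soon as a runaway line needs more memory than that:

    use strict;
    use warnings;
    use BSD::Resource;                      # CPAN module, assumed installed

    my $limit = 50 * 1024 * 1024;           # arbitrary 50 MB address-space cap
    setrlimit(RLIMIT_AS, $limit, $limit)
        or die "could not set memory limit: $!";

    open my $fh, '<', 'suspect.txt' or die "open: $!";   # hypothetical input file
    while (my $line = <$fh>) {
        # a single "line" larger than the cap makes perl die right here
        process($line);
    }
    close $fh;

    sub process { my ($line) = @_; print length($line), "\n" }   # placeholder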

Replies are listed 'Best First'.
Re^2: Safely reading line by line
by martin (Friar) on Jun 27, 2007 at 16:49 UTC
    A total memory limit for a process will contain the impact a single failure has on the rest of the system. This is a reasonable precaution.

    On my Debian GNU/Linux box I can call

        ulimit -v 10000

    in the shell before starting my program, and it will no longer be able to use more than 10000 kilobytes of virtual memory.

    However, that is not all I wanted. I would like to be able to stop processing the input file as soon as its contents are known to be malformed and take whatever evasive action is most appropriate. This would rule out plainly crashing in many cases.
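
    A rough sketch of that kind of check (my own, not from the thread), reading in fixed-size chunks and bailing out gracefully once a partial line exceeds a chosen cap; the file name and the 1 MB limit are made up for illustration:

        use strict;
        use warnings;

        my $MAX_LINE = 1024 * 1024;                 # arbitrary 1 MB per-line cap
        open my $fh, '<', 'input.txt' or die "open: $!";

        my $buf = '';
        while (read($fh, my $chunk, 8192)) {        # fixed-size chunks, not lines
            $buf .= $chunk;

            # hand over complete lines as soon as they are available
            while ((my $nl = index($buf, "\n")) >= 0) {
                my $line = substr($buf, 0, $nl + 1, '');
                process_line($line);
            }

            # what remains is a partial line; if it is already too long,
            # the input is malformed -- take evasive action instead of crashing
            if (length($buf) > $MAX_LINE) {
                warn "line longer than $MAX_LINE bytes, aborting\n";
                last;
            }
        }
        close $fh;

        sub process_line { my ($line) = @_; print $line }   # placeholder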
