Problems? Is your data what you think it is?
Assuming $fh is a handle to a regular file, File::Util has an interesting idea: it has a "readlimit" method that caps the size of any file it will read.
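From my reading of the File::Util docs, the usage is roughly this (the filename is a stand-in, and note that newer versions may spell the method read_limit — check the current POD):

```perl
use strict;
use warnings;
use File::Util;

# Hedged sketch of File::Util's size cap, per my reading of its docs.
my $ftl = File::Util->new();
$ftl->readlimit( 1_048_576 );                 # refuse files over 1 MB
my $content = $ftl->load_file( 'data.txt' );  # 'data.txt' is hypothetical
```

The catch, as noted below, is that this is a size check at read time, not an ongoing guarantee.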
Of course, if your attacker has local access, or you're reading from a socket, that won't save you: the file could be appended to or modified AFTER you've opened it and the size check has passed.
Letting the interpreter crash is looking quite tempting :) Of course, that's only an option if it's not going to result in a denial-of-service attack.
I think your idea of writing your own buffered, length-limited readline on top of read or sysread is probably the way to go, though it will be mildly complex if you want it to be efficient. If you do work that out, it would probably make a nice addition to IO::Handle.
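A rough, untested sketch of what I mean, built on read() (the sub name, the per-handle %buffers hash, and the policy of dying on an overlong line are all my own choices, not anything from IO::Handle):

```perl
use strict;
use warnings;
use Scalar::Util 'refaddr';

# Per-handle leftover buffers, keyed by the handle's refaddr.
my %buffers;

# Return the next line (up to $limit bytes before the newline),
# undef at EOF or on read error; die if a line exceeds the limit.
sub limited_readline {
    my ( $fh, $limit ) = @_;
    my $buf = \( $buffers{ refaddr $fh } //= '' );
    my $eof = 0;

    # Pull in blocks until we see a newline, exceed the limit, or hit EOF.
    # Memory is bounded: at most $limit + one block.
    while ( index( $$buf, "\n" ) < 0 && length($$buf) <= $limit && !$eof ) {
        my $n = read( $fh, $$buf, 4096, length $$buf );
        defined $n or return undef;    # read error
        $eof = 1 if $n == 0;
    }

    my $pos = index $$buf, "\n";
    if ( $pos >= 0 && $pos <= $limit ) {
        # 4-arg substr removes the line from the buffer and returns it.
        return substr( $$buf, 0, $pos + 1, '' );
    }
    if ( $pos < 0 && $eof && length($$buf) <= $limit ) {
        return undef unless length $$buf;
        return substr( $$buf, 0, length $$buf, '' );  # final unterminated line
    }
    die "line longer than $limit bytes\n";
}
```

A real version would want a way to resynchronize (or just close the handle) after an overlong line, since the offending data is still sitting in the buffer.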
A reasonable alternative may be to recast your loop in terms of fixed-length reads rather than line reads. But for line-oriented data, that's a pain :(
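The fixed-length variant is at least trivially safe, since memory stays bounded by the block size no matter what gets fed in. Something like this (block size is arbitrary, and the in-memory handle is just to make the demo self-contained):

```perl
use strict;
use warnings;

# Fixed-size block reads: 20_000 bytes of "x" stand in for real input.
open my $fh, '<', \( "x" x 20_000 ) or die "open: $!";

my $total = 0;
my $n;
while ( $n = read( $fh, my $chunk, 8192 ) ) {
    $total += $n;    # ...process $chunk here...
}
die "read error: $!" unless defined $n;    # read() returns undef on error
print "read $total bytes\n";               # prints "read 20000 bytes"
```

The pain is exactly what the readline sketch above hides: any "line" handling now has to deal with lines that straddle block boundaries.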
Hmmm, this wasn't a very helpful response, was it? Sorry about that. You've brought up an interesting problem, and I don't know what the right answer is, but hopefully one of these rambles sparks an idea for someone who DOES know.