in reply to Windows-specific: Limiting output of created files
I don't know of any Windows equivalent of ulimit -f (i.e. per-process file-size limits).
Your limiting-tee idea seems the simplest to implement, but be aware that unless the writing process checks for failed writes, it may not notice that the pipe has gone away.
E.g.: when the second instance of perl below terminates, the first instance blithely continues to write to the pipe. It doesn't cause any memory growth, but it does use a substantial amount of CPU until it decides to stop writing of its own accord.
perl -E"say $_ for 1..100e6" | perl -E"while(<>){ $n += length; last if $n > 1024**2; warn qq[$n\n]; }"
If the first instance checked the success of its writes, it would notice the pipe going away:
perl -E"say or die $^E for 1 .. 100e6" | perl -E"while(<>){ $n += length; last if $n > 1e5; warn qq[$n\n]; }"
...
99990
99996
The pipe is being closed at -e line 1.
But if you can't change the writing process, that's probably not useful info :(
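For reference, a limiting tee along those lines could be sketched as below. This is only a sketch, not an existing tool: the 1 MiB cap and the output filename are assumptions, and whether the upstream writer actually stops depends on it checking its own write results, as discussed above.

```perl
#!/usr/bin/perl
# Hypothetical limiting tee: copy STDIN to a file, stopping once a
# byte budget is exhausted. Filename and cap are illustrative.
use strict;
use warnings;

my $max  = 1024**2;               # 1 MiB cap (assumed limit)
my $file = shift // 'capped.log'; # hypothetical output filename

open my $out, '>', $file or die "open $file: $!";
my $written = 0;
while ( my $line = <STDIN> ) {
    # Stop before the cap would be exceeded.
    last if $written + length($line) > $max;
    print {$out} $line or die "write $file: $!";
    $written += length $line;
}
close $out or die "close $file: $!";
# Exiting here closes our end of the pipe; the upstream writer only
# finds out if it checks the result of its writes (cf. `say or die $^E`).
```

Usage would be along the lines of `some_chatty_program | perl limit_tee.pl capped.log`; the same caveat applies — a writer that ignores write failures will keep burning CPU after the tee exits.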
Replies are listed 'Best First'.

Re^2: Windows-specific: Limiting output of created files
- by Anonymous Monk on Apr 03, 2009 at 17:01 UTC
- by BrowserUk (Patriarch) on Apr 03, 2009 at 17:27 UTC
- by ikegami (Patriarch) on Apr 03, 2009 at 17:42 UTC
- by rovf (Priest) on Apr 06, 2009 at 08:10 UTC
- by BrowserUk (Patriarch) on Apr 06, 2009 at 10:05 UTC
- by rovf (Priest) on Apr 06, 2009 at 10:31 UTC