in reply to A clue!
in thread vanishing system call
Notice the "data seg size" there:

    core file size (blocks)     unlimited
    data seg size (kbytes)      131072
    file size (blocks)          unlimited
    max memory size (kbytes)    1019872
    open files                  4096
    pipe size (512 bytes)       8
    stack size (kbytes)         2048
    cpu time (seconds)          unlimited
    max user processes          64
    virtual memory (kbytes)     1048576

If it's less than the total size of your files, you may be hitting your limit. For example, if it's set at 100MB for you, then 3 files at 30MB each will process easily, but a fourth file bringing the total up to 120MB will hit your data segment limit (and probably cause a "segmentation fault") and it'll be as if the sort never occurred.
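One way to check whether that's what's happening is to compare the data segment limit against the total size of the files being sorted. A minimal sketch (the file names are just placeholders for your own files):

    # Show all current limits; the output quoted above comes from this command
    ulimit -a

    # Just the data segment limit, in kbytes
    ulimit -d

    # Total size of the input files, in kbytes, for comparison
    du -ck file1 file2 file3 file4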
If you're very lucky (assuming your system administrators like you, etc.) you'll be able to raise the size of your data segment with "ulimit -d <larger number here>". You probably won't be able to raise it above the "max memory size" that you've been given, though.
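For example, something along these lines might do it, assuming your hard limit allows it (the 512MB figure is only an illustration):

    # The hard ceiling you're allowed to raise the soft limit to
    ulimit -Hd

    # Try to raise the soft data segment limit to 512MB (value is in kbytes);
    # this will fail if it's above the hard limit set for your account
    ulimit -d 524288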
Hope that helps.
jarich