Talking about quotas... Have you ensured that you are allowed to use that much memory?
Make sure that you're not exceeding your data segment size. To check, type "ulimit -a" at your Solaris prompt. You'll get something like:
core file size      (blocks)     unlimited
data seg size       (kbytes)     131072
file size           (blocks)     unlimited
max memory size     (kbytes)     1019872
open files                       4096
pipe size           (512 bytes)  8
stack size          (kbytes)     2048
cpu time            (seconds)    unlimited
max user processes               64
virtual memory      (kbytes)     1048576
Notice the "data seg size" there. If it's less than the total size of your files you may be hitting your limit. For example if it's set at 100MB for you then 3 files at 30MB each will process easily, but a forth file bringing the total up to 120MB will hit your segmentation size (and probably cause a "segmentation fault") and it'll be as if the sort never occured.
If you're very lucky (assuming your system administrators like you, etc.) you'll be able to raise the size of your data segment with "ulimit -d <larger number here>". You probably won't be able to raise it above the "max memory size" you've been given, though.
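The programmatic equivalent of "ulimit -d" is setrlimit() on RLIMIT_DATA. A minimal sketch, again assuming a POSIX system: an unprivileged process can only raise its soft limit as far as the hard limit; raising the hard limit itself is where you need your administrators.

/* Raise the soft data-segment limit to whatever the hard limit allows. */
#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;

    if (getrlimit(RLIMIT_DATA, &rl) != 0) {
        perror("getrlimit");
        return 1;
    }

    rl.rlim_cur = rl.rlim_max;        /* soft limit up to the hard limit */
    if (setrlimit(RLIMIT_DATA, &rl) != 0) {
        perror("setrlimit");          /* reports why the request was refused */
        return 1;
    }

    printf("data seg size raised to its hard limit\n");
    return 0;
}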
Hope that helps.
jarich