I have written a small script with a rather specific task: it reads variously sized files into memory, encodes them, and then writes them to a socket. I developed it on a 1.4 GHz SuSE Linux box (2.4 kernel) with 512 MB of RAM, and everything works fine. When I take the script to the client, running AIX on an RS/6000, I get an out-of-memory error and the process dies after it has read in about 120 MB. These machines have 512+ MB of RAM and were not even running X at the time. I have tested multiple scripts on these machines, as well as on my Linux cluster and our Solaris cluster. The only ones that run out of memory are the AIX machines, and it always happens around 120 MB. Has anyone else run into this problem?
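For reference, the pattern is essentially this (a minimal Python sketch of what the script does, not the actual code; the encoding is assumed to be base64, and the function name and the writable `out` object are hypothetical). Note that with this approach both the raw file contents and the encoded copy are alive in memory at the same time, so peak usage is well above the file size itself:

```python
import base64

def send_file_contents(path, out):
    """Read an entire file into memory, encode it, and write it out.

    Sketch of the pattern described above: the whole file is slurped
    into one buffer, then base64 encoding allocates a second, larger
    buffer (~4/3 the raw size), so peak memory is roughly 2.3x the
    file size before anything hits the socket.
    """
    with open(path, "rb") as f:
        data = f.read()               # entire file held in memory
    encoded = base64.b64encode(data)  # second, larger allocation
    out.write(encoded)                # out could be a socket or any
                                      # file-like object with write()
```

If the per-process memory limit on the AIX boxes is lower than on the Linux box, a pattern like this would hit it at a file size well below physical RAM.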