I'm using the command below to fetch data from a remote server over SSH in Perl:
    my $o = `ssh -n -x -q -o StrictHostKeyChecking=no -o BatchMode=yes -o ConnectTimeout=5 host /script 2>/dev/null`;

I then process $o and print it to STDOUT. The problem is that when the data returned by ssh is huge, the Perl program hangs forever. A truss of the process ID shows output like:

    read(3, 0x001DF90C, 5120)  = 5120
    --- chunk of data from remote host ---
    brk(0x001E2470)            = 0
    brk(0x001E2470)            = 0
    brk(0x001E4470)            = 0
    read(3, 0x001DF90C, 5120)  = 5120
    --- chunk of data from remote host ---
    --- the above continues until all data from the remote host is read ---
    --- then the write operations start: ---
    write(4, 0x00229D64, 5120) = 5120
    --- writes a chunk of the data read from the remote host ---
    --- continues for a few more chunks, then hangs forever: ---
    write(4, 0x00229D64, 5120) (sleeping...)
I also found that the ssh process created by the Perl program is not the one hanging.
Is there a problem with storing huge data in a single variable (in memory) in Perl?
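For comparison while debugging, here is a minimal sketch of the same fetch done with a piped open instead of backticks, so each line is processed and printed as it arrives rather than the whole reply being held in $o first. This is only an illustrative variant, not a confirmed fix; "host" and "/script" are the placeholders from the original command, and the list form of open cannot express the original 2>/dev/null redirection:

```perl
use strict;
use warnings;

# Stream the remote command's output instead of slurping it.
# List-form open avoids the shell; stderr is NOT redirected here,
# unlike the original backtick version's 2>/dev/null.
open(my $ssh, '-|',
    'ssh', '-n', '-x', '-q',
    '-o', 'StrictHostKeyChecking=no',
    '-o', 'BatchMode=yes',
    '-o', 'ConnectTimeout=5',
    'host', '/script',
) or die "cannot start ssh: $!";

while (my $line = <$ssh>) {
    # Process each line here, then emit it immediately;
    # at most one line is held in memory at a time.
    print $line;
}

close $ssh or warn "ssh exited with status $?\n";
```

If the hang persists even with line-by-line output, that would point away from the size of $o and toward whatever is consuming this program's STDOUT.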
Here's the ulimit output for the system:

    time(seconds)        unlimited
    file(blocks)         unlimited
    data(kbytes)         unlimited
    stack(kbytes)        8192
    coredump(blocks)     unlimited
    nofiles(descriptors) 4096
    vmemory(kbytes)      unlimited
I'd appreciate your help in sorting out this problem.
In reply to perl hangs on write syscall by Anonymous Monk