in reply to Re^2: Pipe Problem
in thread Pipe Problem

Sorry. I assumed that you did not have the source to the C program. If you can modify the source, you should be able to call the C library function fflush() on stdout (or stderr) after printing each progress message.


Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
"Too many [] have been sedated by an oppressive environment of political correctness and risk aversion."

Replies are listed 'Best First'.
Re^4: Pipe Problem
by Anonymous Monk on Jul 23, 2007 at 01:22 UTC
    No, the C program is working correctly. If I run it from the command line without calling it from Perl, I see the percent completed being printed to the screen as the C program progresses:

    +::010 (some time later) +::020 +::030 +::040 +::050 +::060 +::070 +::080 +::090 +::100

    However, when I run the Perl script and invoke the C program from within Perl, all the above lines get printed out at the same time after the C program is done. I can monitor this because I have another terminal connected to the database where the C program is sending requests, and only when the C program has finished all its requests and disconnected, after a few hours, do I see all the above lines printed to the screen at once.
      That would be because when you run your executable connected to a terminal, STDOUT doesn't buffer... well, it's actually line buffered. But when you run it connected to a pipe, it buffers differently (probably 1024 bytes or so, but I don't know for sure). If you force a flush as outlined, the problem will disappear. The problem is introduced by the OS handling of pipes, not the C program (executable); neither it nor Perl is at fault.
      the hardest line to type correctly is: stty erase ^H
        The problem is introduced by the OS handling of pipes.

        I've seen this said here before--and I've even believed it--but it's not the case. Take the following example (wrapped for posting):

        perl -e"Win32::Sleep(100) and print qq[$_ ...] for 1 .. 100" | perl -ne"print qq[\r'$_']"

        One perl process piped to another by the OS. The attempt is to reflect the progress indicator produced by the first process using the second. Needless to say, it doesn't work: there is nothing for 10 seconds, then, like London buses, all the output comes along at once.

        But, where are the delays/buffering?

        If you try the first process on its own:

        perl -e"Win32::Sleep(100) and print qq[$_ ...] for 1 .. 100"

        You'll again wait 10 seconds before seeing anything, and then get it all at once. So one level of buffering is in the originating process. Easily fixed using $|++.

        perl -e"$|++; Win32::Sleep(100) and print qq[$_ ...] for 1 .. 100"

        And sure enough, we see new output appear every 1/10th of a second for 10 seconds.

        So, now try to read and reflect that via a pipe:

        perl -e"$|++; Win32::Sleep(100) and print qq[$_ ...] for 1 .. 100" | perl -ne"print qq[\r'$_']"

        And once again, nothing, nothing, nothing, Bang! Must be the pipe that's the problem, right?

        Well, no. The second perl process is using the default $INPUT_RECORD_SEPARATOR ($/), so it doesn't give the Perl program any input until it sees a newline. But the first process never produces any newlines. So the second doesn't give the program any input until it sees EOF.

        So how about we set $/ ($INPUT_RECORD_SEPARATOR) for the second process:

        perl -e"$|++; Win32::Sleep(100) and print qq[$_ ...] for 1 .. 100" | perl -e"BEGIN{ $/='...'}" -ne"print qq[\r'$_']"

        Same deal. All the output comes in one lump after 10 seconds. It has to be an OS problem, right?

        But still no. The second perl process is still buffering its output. We are still not adding any newlines anywhere, so it again waits for us to tell it when to flush the output.

        So, disable stdout buffering on the second process also:

        perl -e"$|++; Win32::Sleep(100) and print qq[$_ ...] for 1 .. 100" | perl -e"BEGIN{ $|++; $/='...'}" -ne"print qq[\r'$_']"

        '100 ...'

        And sure enough, this time we get the effect we are after. The progress indicator rolls over once every 1/10th of a second, overlaid in-place because the second Perl process added \rs.

        Note. The piping, as performed by the OS, didn't change. All the changes are within the originating and receiving processes, in this case Perl. If they are set up to communicate directly, without buffering, then they can.

        What this shows is that any output from the first process is immediately available to the second. All the buffering is occurring within the processes themselves, not within the OS/pipe. The problem is not in the OS piping, but within the defaults set by the Perl processes.

        It may well be that other OSs disable buffering on stdin/stdout within the processes they start when connecting them via pipes. If PerlIO chose to emulate that behaviour when printing to, or reading from, a pipe, it could.

        Thanks. That probably explains it. So the workaround is to have the C program print out 1024 bytes or so on each line. After all this time, I am surprised that no one has found a better solution for this.