Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

After much programming and processing, I finally have my data ready for the last step - I want to run it (basically a text file) through a program (call it foo), and write out the processed file.

I have tried to search for ways to do this, but two-way pipes seem fairly complicated. Should I write the data to a temporary file, have foo read that file (it can read from a file or from STDIN, and always writes to STDOUT unless redirected), capture foo's STDOUT, and write that to the output file?

A search seems to tell me that the preferred way is to do something like:

open(PIPE_FROM_FOO, "-|");

But I still didn't see how to call foo in the examples (they mostly concerned writing to a pipe).

I have my text file in a variable called $myfile. Can someone tell me the "best" way (or any good way) to run it through foo?
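For reference, the temporary-file route described above can be sketched roughly like this (a minimal sketch, not foo's actual invocation: 'cat' stands in for foo so the code runs, and the file names are made up):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

my $myfile = "line one\nline two\n";    # the text built up so far

# 1. Write the data to a temporary file for foo to read.
my ($tmp_fh, $tmp_name) = tempfile(UNLINK => 1);
print $tmp_fh $myfile;
close $tmp_fh or die "close: $!";

# 2. Run foo on the temp file and capture its STDOUT through a
#    read pipe.  ('cat' stands in for the real foo here.)
my @foo = ('cat');
open my $pipe, '-|', @foo, $tmp_name or die "can't run @foo: $!";
my $result = do { local $/; <$pipe> };  # slurp all of foo's output
close $pipe or die "@foo failed: $?";

# 3. Write the processed output to its final destination.
open my $out, '>', 'processed.txt' or die "open: $!";
print $out $result;
close $out or die "close: $!";
```

The list form of open ('-|', LIST) avoids the shell entirely, which matters if file names can contain spaces or shell metacharacters.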

update (broquaint): title change (was piping)

Replies are listed 'Best First'.
Re: Confusion with two-way pipes
by integral (Hermit) on Feb 18, 2003 at 09:24 UTC
    The IPC::Open2 (and IPC::Open3) modules are in core and make this less painful, but if you're going to write the output of the process to stdout or another file, you could use a simple piped open like this:
    open PIPE, "|program >$file";
    print PIPE "testing. output will go to $file, since perl will pass the command line through the shell\n";

    --
    integral, resident of freenode's #perl
    
Re: Confusion with two-way pipes
by tall_man (Parson) on Feb 18, 2003 at 16:06 UTC
    It's not clear from your question if you want to do additional processing in perl after you get the output from foo. If you get to the point of needing pipes to and from a command, I suggest you look at IPC::Run, which is reputed to be safer than IPC::Open2 or IPC::Open3. Here is some example code adapted from the IPC::Run man page:
    use strict;
    use IPC::Run qw( start finish );

    my @cat = qw( cat );

    # Create pipes for you to read / write (like IPC::Open2 & 3).
    my $h = start \@cat,
                  '<pipe',  \*IN,
                  '>pipe',  \*OUT,
                  '2>pipe', \*ERR
        or die "cat returned $?";

    print IN "some input\n";
    close IN;
    print <OUT>, <ERR>;
    finish $h;
Re: Confusion with two-way pipes
by Anonymous Monk on Feb 18, 2003 at 23:32 UTC
    Thank you everyone for your replies.

    Integral, since foo represents the last step before writing the file, your simple solution will work best, right now.

    Tall_man, thank you for the strongest theoretical solution (something I want to learn, because I may need it in the future).

    Clairudjinn, the outside program foo (NOT a perl script) will do the final processing, and I need to get the data to that program.

    Bsb, I will have to do this on hundreds of files, and I saw a number of recommendations against using system calls.
Re: Confusion with two-way pipes
by bsb (Priest) on Feb 18, 2003 at 22:54 UTC
    I tend to just read from STDIN and write to STDOUT then use shell commands, a shell script or a perl 'system' to run it.

    That way you can do it to files and examine each stage during development, or do one huge pipeline, or even run it interactively.
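    A minimal sketch of that filter pattern (the substitution is just an example step, and the file names in the comments are made up):

```perl
use strict;
use warnings;

# One per-line transformation; swap in whatever processing you need.
sub munge {
    my ($line) = @_;
    $line =~ s/\s+$/\n/;    # example step: trim trailing whitespace
    return $line;
}

# As a filter: read STDIN, write STDOUT, so the script composes with
# shell redirection or pipelines, e.g.
#   perl filter.pl < in.txt > out.txt
#   foo < in.txt | perl filter.pl > out.txt
print munge($_) while <STDIN>;
```

    Because the script touches only STDIN and STDOUT, you can test each stage on its own during development and then chain them in one pipeline, exactly as described above.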

Re: Confusion with two-way pipes
by clairudjinn (Beadle) on Feb 18, 2003 at 22:53 UTC
    I fail to see how a POF (plain old filehandle) doesn't do what you want it to. Basically, you have a file, a script for processing the file, and script output that you want to redirect to another file. Why do you need pipes?

    open IN, "some_file" or die $!;
    open OUT, ">>some_other_file" or die $!;

    # generic file processing
    while (<IN>) {
        # process $_ into $result ...
        print OUT $result;
    }
    "Would you fly in a jet whose guidance software was written in Perl?"