in reply to splitting an input stream
Update: Corion++ pointed out that I had reversed the sense of your problem; sorry for my confusion.
Instead, have your wrapper script read 1000 lines from foo, put them in a temporary file and then invoke X. Repeat till done.
    #! perl -sw
    use strict;

    open FOO, '<', 'foo' or die $!;
    while( not eof( FOO ) ) {
        my @lines;
        push @lines, scalar <FOO> for 1 .. 1000;

        open TEMPFOO, '>', 'tempfoo' or die $!;
        print TEMPFOO grep defined, @lines;   # last batch may be short of 1000
        close TEMPFOO;

        qx[ x --filename:tempfoo -firstline:1 -lastline:1000 ];
    }
    close FOO;
Instead of having X supply you 1000 lines at a time, ask for them all and pipe the result to your perl script. In your perl script, read 1000 lines from the pipe into an array, process them, then loop back and get the next 1000 lines. Repeat till done.
The output of X will be blocked while your script processes each batch of 1000 lines. Your script will only ever have to hold 1000 lines in memory at a time. X will never have to backtrack or skip over any lines.
A silly example. (One liner wrapped for display)
perl -ne"print" junk | perl -le" while( not eof(STDIN) ) { $,=' '; push @a, scalar <> for 1..10; chomp @a; print reverse @a; @a=(); }" #Outputs 10 9 8 7 6 5 4 3 2 1 20 19 18 17 16 15 14 13 12 11 30 29 28 27 26 25 24 23 22 21 40 39 38 37 36 35 34 33 32 31 50 49 48 47 46 45 44 43 42 41 60 59 58 57 56 55 54 53 52 51 70 69 68 67 66 65 64 63 62 61 80 79 78 77 76 75 74 73 72 71 90 89 88 87 86 85 84 83 82 81 100 99 98 97 96 95 94 93 92 91 110 109 108 107 106 105 104 103 102 101 ...
The first instance of perl just reads the file junk (one integer per line) and prints it to stdout. The second instance loops: it reads 10 lines, chomps them, reverses them, and prints them before emptying the array and going back for the next 10.
Re: Re: splitting an input stream
by l3nz (Friar) on Nov 07, 2003 at 15:50 UTC