    open SH, "| /bin/sh" or die "can't launch subshell: $!";
    for my $file ( @files ) {
        my $cmd = "perl -Ilib $file >> log\n";
        print STDERR $cmd;
        print SH $cmd;   # stderr from subshell will go to STDERR
    }
    close SH;

update: Since all the processes are appending their stdout to the same log file, you could leave the ">> log" out of the sub-shell command lines, and run the main script like this (assuming a bourne-like shell that allows separate redirection of stdout and stderr):

    $ run-em-all.pl > log 2> errlog

another update: Actually, the for loop shown above will tend to get its own STDERR output mixed up with any STDERR coming from the subshell. By default, output to SH will be buffered, so you're most likely to see all 20 or so command lines listed first; then, if any of them caused problems, their STDERR output will come afterwards. Just imposing autoflush on SH will not solve the problem -- in fact, it could make it worse, since both the subshell and the main perl script could end up writing to STDERR at the same time -- what a mess.
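(For readers who haven't seen it, "imposing autoflush on SH" would be done something like this -- a minimal sketch of the standard idiom, shown only for reference, since as noted above it still doesn't cure the interleaving:)

    use IO::Handle;            # lets filehandles be used as objects
    SH->autoflush(1);          # push each command to the subshell immediately
    # or the old-style equivalent:
    # select((select(SH), $| = 1)[0]);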
Either don't echo the command lines to STDERR, or else log them this way (again, assuming a bourne-like shell):
    $cmd = "echo running $file 1>&2; perl -Ilib $file\n";
    print SH $cmd;

This way, the subshell itself prints a message to stderr before running each little script file.
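Putting the pieces together, the whole runner might end up looking something like this (just a sketch -- the shebang, the glob pattern, and the "run-em-all.pl" name are assumptions for illustration, not part of the original):

    #!/usr/bin/perl
    # run-em-all.pl -- feed each small script to one /bin/sh subshell
    use strict;
    use warnings;

    my @files = glob "scripts/*.pl";   # however you collect the 20-odd script files

    open SH, "| /bin/sh" or die "can't launch subshell: $!";
    for my $file ( @files ) {
        # let the subshell itself announce each script on stderr, so the
        # "running ..." lines stay in order with any error output
        print SH "echo running $file 1>&2; perl -Ilib $file\n";
    }
    close SH or warn "subshell exited with status $?\n";

    # invoke it with stdout and stderr split, e.g.:
    #   $ run-em-all.pl > log 2> errlog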
In reply to Re: running many small, similar scripts
by graff
in thread running many small, similar scripts
by qq