I think I can live with all the functions dumping their various warnings and such into one file. My question is, if programs (under "my" control) are individually opening the logfile via '>>' for their error logging, what problems might I face? My goal is to interweave my log messages (via warn to a redirected STDERR) with any sort of system log messages.
As an example, the following program seems to "share" the file okay. What might beset me with this approach?
    #!/usr/bin/perl -w
    # show two concurrent processes intermingling their STDERR to the same file.
    # Name the program 'test_err.pl' and invoke it from command line with:
    #   test_err.pl hs
    use strict;

    my $err_file = 'errfile.txt';
    open STDERR, ">> $err_file"
        or die "cannot open initial $err_file: $!\n";

    my $what_to_do = shift;

    warn "$what_to_do hello world\n";
    system( 'test_err.pl', 'abc' ) if $what_to_do eq 'hs';

    my $test_string;
    warn "success\n" if $test_string eq 'xyz';

    warn "$what_to_do goodbye world\n";

    # Adds the following text to 'errfile.txt':
    __DATA__
    hs hello world
    abc hello world
    Use of uninitialized value in string eq at C:\aa\test_err.pl line 19.
    abc goodbye world
    Use of uninitialized value in string eq at C:\aa\test_err.pl line 19.
    hs goodbye world
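One risk with plain `>>` sharing is interleaving: append-mode writes land at the end of the file, but if a message is split across buffered writes, another process's output can land between the halves. A hedged sketch of one way to guard against that (the filename `errfile_demo.txt` and the `log_line` helper are made up for illustration): unbuffer the handle and take an exclusive `flock` around each write, so every message reaches the file as one whole line.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock :seek);

my $err_file = 'errfile_demo.txt';

# Open for append; each process gets its own handle,
# but all appends go to the shared end of file.
open my $log, '>>', $err_file
    or die "cannot open $err_file: $!\n";

# Turn off buffering on the log handle so a message is written
# in one go rather than sitting in a stdio buffer.
{ my $old = select $log; $| = 1; select $old; }

# Write one complete line under an exclusive lock, so concurrent
# writers cannot interleave partial messages.
sub log_line {
    my ($msg) = @_;
    flock $log, LOCK_EX or die "flock: $!\n";
    seek $log, 0, SEEK_END;      # re-seek to EOF while holding the lock
    print $log "[$$] $msg\n";    # tag each line with the writer's pid
    flock $log, LOCK_UN;
}

log_line('hello world');
log_line('goodbye world');

close $log or die "close: $!\n";
```

The same `flock`/write/unlock pattern would apply to a redirected STDERR handle; the lock only helps, of course, if every cooperating writer takes it.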
In reply to multiple programs sharing redirected STDERR by ff