in reply to Re: Piping many individual files into a single perl script
in thread Piping many individual files into a single perl script
I personally believe that, as a very minor side note, since the OP mentions something like "10000 files," it is worth pointing out that some shells do have problems with such a large number of files. I can't remember exactly how large is large, but I have occasionally seen error messages along the lines of "command line too long." In that case, adopting the same technique as for Windows would be a cure...
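For the *NIX side of that side note, a common workaround for "command line too long" (the kernel's ARG_MAX limit) is not to expand the pattern on the command line at all, but to feed the names through find and xargs in batches. A minimal sketch, where `cat` stands in for the hypothetical Perl script:

```shell
# Make a scratch directory with a few files standing in for the
# OP's "10000 files".
dir=$(mktemp -d)
for i in 1 2 3; do printf 'line\n' > "$dir/file$i.txt"; done

# Instead of `perl script.pl "$dir"/file*.txt` (which the shell expands,
# and which can exceed ARG_MAX with thousands of names), let find emit
# the names and xargs invoke the command in chunks that fit the limit.
count=$(find "$dir" -name 'file*.txt' -print0 | xargs -0 cat | wc -l)
echo "$count"
```

The `-print0`/`-0` pair keeps filenames with spaces or newlines intact; xargs takes care of splitting the list into as many invocations as needed.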
    use File::Glob qw( bsd_glob );
    @ARGV = map bsd_glob($_), @ARGV;
This sounds very wrong [Update: but it is right, see my reply to ikegami's comment] unless one has a good reason to do so: since we're on Windows, we most probably want DOS/Windows-like globbing, and glob is dwimmy enough to select the correct "incarnation" on its own. In all of my scripts that may want globbing, whether written for Windows or "ported" (what a big word!) there from *NIX, I include the same code as BrowserUk's. Sometimes, depending on how "important" the app will be, I also provide "standard" -i and -o command-line switches for input and output, since shell redirection has some small but not negligible deficiencies.
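A minimal sketch of the technique described here (assumed to match BrowserUk's snippet, which is not reproduced in this node): let perl itself expand any glob patterns left untouched by the shell, using core glob rather than an explicit bsd_glob import:

```perl
use strict;
use warnings;

# cmd.exe does not expand wildcards, so patterns arrive in @ARGV
# verbatim; expand them in-script.  Core glob() picks the
# platform-appropriate implementation, so the same line gives
# DOS/Windows-like globbing under cmd.exe and does no harm on *NIX,
# where the shell has normally expanded the patterns already.
@ARGV = map glob($_), @ARGV;

print "$_\n" for @ARGV;
```

A literal argument without glob metacharacters passes through unchanged, so already-expanded *NIX filenames are not mangled.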
Re^3: Piping many individual files into a single perl script
by ikegami (Patriarch) on Sep 29, 2008 at 14:07 UTC
by blazar (Canon) on Sep 30, 2008 at 10:14 UTC
by ikegami (Patriarch) on Sep 30, 2008 at 15:47 UTC
by blazar (Canon) on Oct 11, 2008 at 19:47 UTC
Re^3: Piping many individual files into a single perl script
by Anonymous Monk on Sep 29, 2008 at 14:11 UTC