UrbanHick has asked for the wisdom of the Perl Monks concerning the following question:

Greetings again all

A guy who used to work at this office wrote a complex program that renders employee information into 26 separate raw text files: staffinfo01 to staffinfo26. (Each file contains the employees whose last names begin with the same letter, with each person's information on one line.) I am looking for a way to combine them all into a single file so I can work with the whole list at once.

Does Perl provide a simple way of doing this, or can it be done better through a UNIX utility?

Thank you all,

-UH

Replies are listed 'Best First'.
Re: Combining files
by grep (Monsignor) on Oct 23, 2006 at 03:07 UTC
    I would use the UNIX shell command cat
    cat staffinfo[0-9][0-9] > staffinfo_full

    but if you really wanted to use perl
    perl -pe '' staffinfo[0-9][0-9] > staffinfo_full



    grep
    One dead unjugged rabbit fish later
Re: Combining files
by rafl (Friar) on Oct 23, 2006 at 03:13 UTC

    If you don't run on a *NIX platform, or you don't have cat available for some other reason, then something like this would do the job as well:

    use IO::File;

    my $output_fh = IO::File->new( 'fullstaffinfo', 'w' ) or die $!;

    for my $input (<staffinfo*>) {
        my $input_fh = IO::File->new( $input, 'r' ) or die $!;
        $output_fh->print(do { local $/ = undef; <$input_fh> });
        $input_fh->close;
    }

    $output_fh->close;

    Cheers, Flo

      I don't think there's any guarantee of the order in which <staffinfo*> will return files. Adding sort avoids the problem. (I prefer to use glob explicitly over using the highly overloaded <...> notation.)

      for my $input (sort glob 'staffinfo*') { ... }
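      Filling that loop out into a complete script might look like the sketch below (file names follow the question; the glob pattern staffinfo[0-9][0-9] is an assumption borrowed from grep's reply so the output file can never match its own input pattern):

      ```perl
      use strict;
      use warnings;

      # Combine a sorted set of input files into one output file.
      sub combine_files {
          my ($pattern, $outfile) = @_;
          open my $out, '>', $outfile or die "$outfile: $!";
          # sort guarantees a deterministic order; glob alone does not.
          for my $input (sort glob $pattern) {
              open my $in, '<', $input or die "$input: $!";
              print {$out} $_ while <$in>;
              close $in;
          }
          close $out;
      }

      combine_files('staffinfo[0-9][0-9]', 'staffinfo_full');
      ```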

      This looks like just the ticket.

      -Thanks!

      -UH