Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hello all,
My script calls another utility under a different user account on my machine (Linux OS) and dumps data.
The structure of the dumped data is roughly as follows:
Inside an Output folder there are 500+ folders, each containing one file called "A.rpt" along with some other unneeded files. The file name (A.rpt) is exactly the same in all 500 folders, but its content differs from folder to folder. I need to collate the contents of every A.rpt into one big file and later use this big file for parsing.
Need help urgently!

Example:
Output/DirA/A.rpt
Output/DirB/A.rpt
Output/DirC/A.rpt
and so on...
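
For a fixed layout like this, a short glob-based sketch may be enough (the top-level folder name Output matches the example above; the output file name big.rpt is a hypothetical choice):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # big.rpt is a hypothetical output name; adjust to taste
    my @rpt_files = glob 'Output/*/A.rpt';    # one A.rpt per subdirectory
    die "No A.rpt files found under Output/\n" unless @rpt_files;

    open my $out, '>', 'big.rpt' or die "Cannot open big.rpt: $!";
    for my $file (@rpt_files) {
        open my $in, '<', $file or die "Cannot open $file: $!";
        print {$out} "# --- $file ---\n";   # marker so the parser can tell reports apart
        print {$out} <$in>;                 # copy the whole report
        close $in;
    }
    close $out or die "Cannot close big.rpt: $!";

Note that glob only matches the exact depth given in the pattern; if the A.rpt files can sit deeper in the tree, a recursive walk such as the File::Find reply below is the safer choice.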


Re: Concatenating content of many files from different directories into one big file
by toolic (Bishop) on Jun 28, 2012 at 18:02 UTC
Re: Concatenating content of many files from different directories into one big file
by zentara (Cardinal) on Jun 28, 2012 at 20:04 UTC
    Here is a moderately tested File::Find example:

    #!/usr/bin/perl
    use warnings;
    use strict;
    use File::Find;

    # get dir names from command line or use default of current dir
    my @dirs = @ARGV ? @ARGV : '.';

    my $big_file = '/home/zentara/1down/5/big_file';
    open my $out, '>', $big_file or die "Cannot open $big_file: $!";

    find( sub {
        # skip anything that is not a plain file named exactly A.rpt
        return unless -f;
        return unless $_ =~ /^A\.rpt$/;
        open my $rpt, '<', $_ or die "Cannot open $File::Find::name: $!";
        print {$out} <$rpt>;   # slurp this A.rpt into the big file
        print {$out} "\n";     # blank line between reports
        close $rpt;
    }, @dirs );

    __END__
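
    A hypothetical invocation, assuming the script is saved as collate_rpt.pl and the data lives under Output:

        perl collate_rpt.pl Output

    Because File::Find descends recursively, this also picks up A.rpt files nested more than one level deep.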

    I'm not really a human, but I play one on earth.