Hello all,
My script runs another utility under a different user account on my Linux machine, and that utility dumps data.
The structure of the dumped data is roughly as follows:
Inside an Output folder there are 500+ sub-folders, each containing a file named "A.rpt" along with some other files I don't need. The file name A.rpt is exactly the same in all 500+ folders, but its content differs from folder to folder. I need to collate the contents of every A.rpt into one big file, and later use that big file for parsing.
I need help urgently!
Example:
Output/DirA/A.rpt
Output/DirB/A.rpt
Output/DirC/A.rpt
and so on...
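For a layout like the one above, one approach is to glob for every `Output/*/A.rpt` and append each file to a single combined file. Here is a minimal sketch in Python; the output file name `all_reports.rpt`, the `collate_reports` function name, and the `### path` separator lines are my own choices, not anything from the original setup, so adjust them to taste:

```python
import glob
import os

def collate_reports(root="Output", out_file="all_reports.rpt"):
    """Concatenate every <root>/<subdir>/A.rpt into one big file.

    Returns the number of A.rpt files collated.
    """
    # One A.rpt per sub-folder; sorted() keeps the order stable
    # across runs (DirA, DirB, DirC, ...).
    rpt_paths = sorted(glob.glob(os.path.join(root, "*", "A.rpt")))
    with open(out_file, "w") as big:
        for path in rpt_paths:
            with open(path) as rpt:
                # Prefix each chunk with its source path so the later
                # parsing step can tell the 500+ reports apart.
                big.write(f"### {path}\n")
                big.write(rpt.read().rstrip("\n") + "\n")
    return len(rpt_paths)

if __name__ == "__main__":
    n = collate_reports()
    print(f"collated {n} report(s)")
```

Since the separator line records which folder each chunk came from, the later parsing step can still attribute data to its original directory even though every source file shares the same name.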
In reply to Concatenating content of many files from different directories into one big file by Anonymous Monk