lnin has asked for the wisdom of the Perl Monks concerning the following question:
The code I have written is as follows. Please advise if there is a better way of writing it that takes less time:
my $MergedFileName = "MergeOutput.txt";

# Read the log file names into an array: .txt extension only, sorted by
# the timestamp embedded in each filename
opendir(DIR, '.') or die "Input directory not available. Error #: $!";
my @filesRead = sort(grep(/\.txt$/, readdir(DIR)));
closedir(DIR);

# Open the output file (note the $ sigil: ">MergedFileName" without it
# would create a file literally named "MergedFileName")
open(MAINOUTPUT, ">$MergedFileName") || warn "Can't open file\n";

# Start merging into the main file by reading each file, line by line
FILE: foreach (@filesRead) {
    open(FILE, $_) || ((warn "Can't open file $_\n"), next FILE);
    while (<FILE>) {
        print MAINOUTPUT $_;
    }
    close(FILE);
}
close(MAINOUTPUT);
# Merging of the files is done
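One way to reduce the run time is to copy each file in fixed-size chunks rather than line by line, which cuts the number of I/O calls considerably for 10 MB logs. Below is a minimal sketch of that approach; the subroutine name `merge_txt_files`, the 64 KB chunk size, and the output filename are illustrative assumptions, not part of the original post.

```perl
use strict;
use warnings;

# Sketch: merge the given files into $out_name, copying in 64 KB chunks
# instead of line by line. Names here are illustrative assumptions.
sub merge_txt_files {
    my ($out_name, @files) = @_;
    open my $out, '>', $out_name or die "Cannot open $out_name: $!";
    binmode $out;
    for my $file (@files) {
        open my $in, '<', $file or do { warn "Cannot open $file: $!"; next };
        binmode $in;                       # raw byte reads, no line splitting
        my $buf;
        print {$out} $buf while read $in, $buf, 65536;
        close $in;
    }
    close $out;
}

# Collect *.txt files sorted by name (timestamped names sort
# chronologically), excluding the output file so it is never
# merged into itself.
my $merged = 'MergeOutput.txt';
opendir my $dh, '.' or die "Cannot open directory: $!";
my @files = sort grep { /\.txt$/ && $_ ne $merged } readdir $dh;
closedir $dh;
merge_txt_files($merged, @files);
```

Excluding `$merged` from the input list also fixes a subtle problem in the original: on a second run, the old `MergeOutput.txt` matches `/\.txt$/` and gets merged into the new output.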
Replies are listed 'Best First'.
Re: How to merge Huge log files (each 10 MB) into a single file
  by BrowserUk (Patriarch) on Sep 03, 2009 at 14:26 UTC
    by lnin (Initiate) on Sep 04, 2009 at 05:29 UTC

Re: How to merge Huge log files (each 10 MB) into a single file
  by Anonymous Monk on Sep 03, 2009 at 14:07 UTC

Re: How to merge Huge log files (each 10 MB) into a single file
  by lostjimmy (Chaplain) on Sep 03, 2009 at 14:12 UTC
    by lnin (Initiate) on Sep 03, 2009 at 14:23 UTC
    by ikegami (Patriarch) on Sep 03, 2009 at 14:26 UTC
    by SuicideJunkie (Vicar) on Sep 03, 2009 at 14:48 UTC

Re: How to merge Huge log files (each 10 MB) into a single file
  by SuicideJunkie (Vicar) on Sep 03, 2009 at 14:09 UTC