(using "cat" program instead of perl code for simply transferring large quantities of data)sub consolidate_logs ($$$) { my ($destination_file, $dir, $filename_str) = @_; my @files = get_matching_filenames($dir, $filename_str); open(OUT,"> $destination_file") or die "Could not open file \"$des +tination_file\" for writing"; foreach my $source_file (@files) { print "Processing of log \"$source_file\" started at " . local +time() . "\n"; system("cat $dir/$source_file >> $destination_file"); print "Processing of log \"$source_file\" ended at " . localti +me() . ".\n"; } close(OUT); }
The consolidate_logs function could be simplified to this:

sub consolidate_logs ($$$) {
    my ($destination_file, $dir, $filename_str) = @_;
    system("ls $dir | grep $filename_str | xargs -I{} cat $dir/{} >> $destination_file");
}
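One caveat with building the pipeline by string interpolation: a directory or pattern containing spaces or shell metacharacters will break it. A sketch of a safer variant, assuming the CPAN module String::ShellQuote is available (consolidate_logs_quoted is just an illustrative name, and it still assumes filenames contain no newlines):

use String::ShellQuote qw(shell_quote);

sub consolidate_logs_quoted {
    my ($destination_file, $dir, $filename_str) = @_;
    my $q_dir  = shell_quote($dir);
    my $q_pat  = shell_quote($filename_str);
    my $q_dest = shell_quote($destination_file);
    # Each interpolated value is quoted, so spaces or shell
    # metacharacters in the paths cannot break the pipeline.
    system("ls $q_dir | grep $q_pat | xargs -I{} cat $q_dir/{} >> $q_dest") == 0
        or die "pipeline failed: $?";
}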
Again, an external program ("grep" this time) is used for simple string matching over a large quantity of data.

sub split_logs ($$$) {
    my ($source_file, $business_list, $filename_prefix) = @_;
    foreach my $business (@$business_list) {
        my ($name, $file) = @$business;
        my $outfile = "/inside29/urchin/test/newfeed/$filename_prefix-$file";
        print "Creation of log for $name started at " . localtime() . "\n";
        system(qq{grep "$name" "$source_file" >> "$outfile"});
        print "Log for $name created at " . localtime() . "\n";
    }
}
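Note that grep rescans the entire source file once per business, so with a long business list a single pass in Perl can be much faster. A rough sketch, assuming (as the grep version does) that a line belongs to a business whenever it contains the business name as a substring; split_logs_single_pass is an illustrative name:

# Open every output file up front, then route each input line
# as it is read -- the huge log is only traversed once.
sub split_logs_single_pass {
    my ($source_file, $business_list, $filename_prefix) = @_;
    my %out;
    foreach my $business (@$business_list) {
        my ($name, $file) = @$business;
        my $outfile = "/inside29/urchin/test/newfeed/$filename_prefix-$file";
        open($out{$name}, '>>', $outfile)
            or die "Could not open \"$outfile\" for appending: $!";
    }
    open(my $in, '<', $source_file)
        or die "Could not open \"$source_file\" for reading: $!";
    while (my $line = <$in>) {
        foreach my $name (keys %out) {
            # index() does fixed-string matching, like grep -F would.
            print { $out{$name} } $line if index($line, $name) >= 0;
        }
    }
    close($in);
    close($_) for values %out;
}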
In reply to Re^2: Reduce the time taken for Huge Log files
by Anonymous Monk
in thread Reduce the time taken for Huge Log files
by pr19939