I have a task: read a live feed from the system, which is made up of client log files; each file has about 18K records. We do some processing on each record and then write every record belonging to a specific client to that client's own file, so that in the end we have one file holding every record a given client has made.
Our program handles the first part: it reads the input file and converts it into a 2 x n array, where the first column is the client name and the second is the record.
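For context, the first part looks roughly like this (only a minimal sketch; it assumes the feed is a plain text file and that each processed record reduces to one "clientname,record" line, and the path and parsing are simplified placeholders):

    use strict;
    use warnings;

    # minimal sketch of the working first part: read the feed and build
    # @RECList, where each entry pairs a client name with one record
    # ("/path/to/feed.log" is a placeholder, not the real feed location)
    my @RECList;
    open(my $feed_fh, "<", "/path/to/feed.log")
        or die "couldn't open feed: $!";
    while (my $line = <$feed_fh>) {
        chomp $line;
        # ... the actual per-record processing would happen here ...
        push @RECList, $line;    # "clientname,record"
    }
    close $feed_fh;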
The problem is: when the program starts writing the records to the per-client files, there is a very large delay in processing, more than 2 minutes for each input file.
    @RECList = .....
        # each element is a "clientname,record" pair, e.g.:
        #   sam,plaplapla
        #   jame,bobobo
        #   kate,sososo
    .....

    print "FLASH A-LIST\n";
    foreach my $CDR (@RECList){
        my ($filename, $row) = split(/,/, $CDR);
        open(my $csv_fh, ">>", "/ClientRrecord/$filename.csv")
            or die "couldn't open [$filename.csv]\n" . $!;
        print { $csv_fh } $row . "\n";
    }
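Right now a new file handle is opened for every single record. For comparison, here is a minimal sketch of one possible change: cache one open handle per client and reuse it (this assumes the number of distinct clients stays below the system's open-file limit):

    # sketch: open each client's file once, keep the handle in a hash,
    # and reuse it for every later record of that client
    my %fh_for;    # client name => open file handle
    foreach my $CDR (@RECList) {
        my ($filename, $row) = split(/,/, $CDR);
        $fh_for{$filename} //= do {
            open(my $csv_fh, ">>", "/ClientRrecord/$filename.csv")
                or die "couldn't open [$filename.csv]\n" . $!;
            $csv_fh;
        };
        print { $fh_for{$filename} } $row . "\n";
    }
    close $_ for values %fh_for;

Whether that actually removes the delay is something that would still need to be measured, and it does keep every client file open for the whole run, which may or may not be acceptable.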