This code:

    foreach $uniquekey1 (keys %myHash1) {
        $uniq2 = $file_name1 . '.csv';
        sysopen($CircleGroupHandle1, "$ARGV[0]/$uniq2", O_WRONLY | O_APPEND | O_CREAT)
            or die "Error writing to $!";
If you look closely, the file name is "$ARGV[0]/$file_name1.csv", and all of the variables involved are defined outside the foreach loop. Since none of them change inside the loop, you are opening the same file over and over.
The first optimization would be to extract the loop-invariant work from the loop: both $uniq2 (the file name) and the sysopen can move above the foreach. That prevents your program from opening the same file multiple times (once per key in %myHash1). If %myHash1 has 10k keys, you are currently opening (without ever closing) the same file 10k times.
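A minimal sketch of that rewrite, with hypothetical stand-ins for %myHash1, $file_name1, and the output directory (I don't know what your loop body actually writes, so it just prints key/value pairs):

```perl
use strict;
use warnings;
use Fcntl qw(O_WRONLY O_APPEND O_CREAT);

# Hypothetical sample data standing in for your real variables
my %myHash1    = ( a => 1, b => 2 );
my $file_name1 = 'circle_group';
my $dir        = $ARGV[0] // '.';   # output directory, as in $ARGV[0]

# Loop-invariant: compute the name and open the file ONCE, before the loop
my $uniq2 = $file_name1 . '.csv';
sysopen( my $CircleGroupHandle1, "$dir/$uniq2",
    O_WRONLY | O_APPEND | O_CREAT )
    or die "Error opening $dir/$uniq2: $!";

foreach my $uniquekey1 ( keys %myHash1 ) {
    # whatever per-key work you do, writing through the one open handle
    print $CircleGroupHandle1 "$uniquekey1,$myHash1{$uniquekey1}\n";
}

# Close once, after the loop, and check for write errors
close $CircleGroupHandle1 or die "Error closing $dir/$uniq2: $!";
```

Note the lexical filehandle and the close-with-error-check at the end; buffered data that was never flushed to disk is a classic cause of "losing data" when a handle is opened thousands of times and never closed.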
In reply to Re^3: speed a major issue with below code also loosing data while writing in files
by bluescreen
in thread speed a major issue with below code also loosing data while writing in files
by Anonymous Monk