in reply to <speed a major issue with below code also loosing data while writing in files>

You seem to be opening the output file multiple times in the foreach loop at the end; that not only hurts the running time but can also be the source of your data-loss issue. Open the file once, before the loop:

open my $fh, '>>', $filename or die "Cannot open $filename: $!";
foreach my $uniquekey ( keys %myHash ) {
    ...
    print {$fh} ...;
}
close $fh;
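A runnable sketch of that pattern, with the return values of open, print, and close all checked (an unchecked failed write is a classic cause of silently lost data). The file name and hash contents here are placeholders, not your real data:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $filename = 'out.csv';                           # placeholder output name
my %myHash   = ( k1 => "row1\n", k2 => "row2\n" );  # placeholder data

# Open once, before the loop, and report the file name on failure.
open my $fh, '>>', $filename
    or die "Cannot open $filename: $!";

foreach my $uniquekey ( keys %myHash ) {
    print {$fh} $myHash{$uniquekey}
        or die "Write to $filename failed: $!";
}

# close() can report buffered-write errors, so check it too.
close $fh
    or die "Close of $filename failed: $!";
```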

If this is a one-off script then don't worry *much* about the following, but if you have to maintain it over time, consider the following:

Replies are listed 'Best First'.
Re^2: <speed a major issue with below code also loosing data while writing in files>
by Anonymous Monk on Jul 26, 2011 at 02:48 UTC
    Dear monk, thanks for your reply, but my major concern is that I am using a file name taken from the file data itself, which is a unique key for me to look up; however, the CDR is the content of that file, which also contains the file name hidden in it. Help will be highly appreciated, as I am dealing with a source file having 1+ million records in it.

      This code:

      foreach $uniquekey1 ( keys %myHash1 ) {
          $uniq2 = $file_name1 . '.csv';
          sysopen( $CircleGroupHandle1, "$ARGV[0]/$uniq2",
              O_WRONLY | O_APPEND | O_CREAT )
              or die "Error writing to $!";

      If you look closely, the file name is composed of "$ARGV[0]/$file_name1.csv". All of the variables involved are defined outside the foreach loop; if they don't change, that means that within the loop you are opening the same file over and over.

      The first optimization would be to extract the constants from the loop: $file_name and the filehandle can be moved outside the foreach loop. This prevents your program from opening the same file multiple times (once per key in %myHash). Say %myHash has 10k keys; that means you are opening (without closing) the same file 10k times.
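      A minimal sketch of that fix, keeping the sysopen/Fcntl style of the original. Since the path does not change inside the loop, it is built and opened once, before the loop. %myHash1, the directory, and the base file name are placeholders standing in for the variables in the real script:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(O_WRONLY O_APPEND O_CREAT);

my $dir        = $ARGV[0] // '.';                       # output directory
my $file_name1 = 'circle_group';                        # placeholder base name
my %myHash1    = ( k1 => "row1\n", k2 => "row2\n" );    # placeholder data

# Build the path and open the handle once, outside the loop.
my $uniq2 = "$file_name1.csv";
sysopen( my $out, "$dir/$uniq2", O_WRONLY | O_APPEND | O_CREAT )
    or die "Cannot open $dir/$uniq2: $!";   # name the file, not just $!

for my $uniquekey1 ( keys %myHash1 ) {
    print {$out} $myHash1{$uniquekey1};
}

close $out or die "Close of $dir/$uniq2 failed: $!";
```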

        Here you will get the idea of what I am trying to do. File name: abc_def_hij.csv. Data on one line, comma-separated: aaa,abc,aaa,def,aaa,hij, etc. Data on another line, comma-separated: xxx,abc,xxx,def,xxx,hij, etc. Now another source file may contain comma-separated data such as bbb,abc,bbb,def,bbb,hij, etc., so the data from that other file should get appended to the abc_def_hij.csv file. I would be grateful if you could solve this. Another thing: if I exclude strict and warnings everything works fine, but it takes about 5 minutes to read 1 million records and create such files from them.
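        The job described above can be sketched as follows. The assumption here is that the target file name is built from the 2nd, 4th and 6th comma-separated fields (so "aaa,abc,aaa,def,aaa,hij" goes to abc_def_hij.csv); adjust the field indices to match the real record layout. Because many different file names can occur, each handle is opened once and cached in a hash, so even a million records never reopen a file:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Read comma-separated records, derive a file name from fields 2, 4, 6
# (hypothetical layout based on the example records above), and append
# each record to its file. Handles are cached so each file opens once.
sub split_records {
    my ( $in_fh, $out_dir ) = @_;
    my %fh_for;    # cache: file name => open filehandle

    while ( my $line = <$in_fh> ) {
        chomp $line;
        my @f = split /,/, $line;
        next unless @f >= 6;    # skip malformed records

        my $file = join( '_', @f[ 1, 3, 5 ] ) . '.csv';
        my $fh = $fh_for{$file} //= do {
            open my $h, '>>', "$out_dir/$file"
                or die "Cannot append to $out_dir/$file: $!";
            $h;
        };
        print {$fh} "$line\n";
    }

    close $_ or die "close failed: $!" for values %fh_for;
}

# Demo on the sample records from the post, written to the current dir.
split_records( \*DATA, '.' );

__DATA__
aaa,abc,aaa,def,aaa,hij
xxx,abc,xxx,def,xxx,hij
bbb,abc,bbb,def,bbb,hij
```

With very many distinct keys you may eventually hit the OS limit on open files; if that happens, an LRU cache of handles (e.g. the FileCache core module) is the usual escape hatch.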