in reply to Re: <speed a major issue with below code also loosing data while writing in files>
in thread <speed a major issue with below code also loosing data while writing in files>

Dear Monk, thanks for your reply, but my major concern is that I am building the file name from the file's data itself, which is the unique key I use for lookups. The CDR is the content of that file, and it also contains the file name embedded in it. Help will be highly appreciated, as I am dealing with a source file that has over 1 million records in it.

Re^3: <speed a major issue with below code also loosing data while writing in files>
by bluescreen (Friar) on Jul 26, 2011 at 12:56 UTC

    This code:

    foreach $uniquekey1 (keys %myHash1) {
        $uniq2 = $file_name1 . '.csv';
        sysopen($CircleGroupHandle1, "$ARGV[0]/$uniq2",
                O_WRONLY | O_APPEND | O_CREAT)
            or die "Error writing to $!";

    If you look closely, the file name is composed of "$ARGV[0]/$file_name1.csv". All of the variables involved are defined outside the foreach loop, so if they do not change, that means that within the loop you are opening the same file over and over.

    The first optimization would be to hoist the constants out of the loop: both $file_name1 and the file handle can be set up before the foreach loop, as shown in the sketch below. This prevents your program from opening the same file multiple times (once per key in %myHash1). If %myHash1 has 10k keys, you are currently opening (without closing) the same file 10,000 times.
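    A minimal sketch of that hoisting, reusing the names from the snippet above ($file_name1 and %myHash1 are assumed to be populated earlier in the program, and the write inside the loop is only a placeholder):

        use strict;
        use warnings;
        use Fcntl qw(O_WRONLY O_APPEND O_CREAT);

        my ($file_name1, %myHash1);    # assumed populated earlier in the program

        # The file name never changes inside the loop, so build it and
        # open the handle once, before iterating.
        my $uniq2 = $file_name1 . '.csv';
        sysopen(my $CircleGroupHandle1, "$ARGV[0]/$uniq2",
                O_WRONLY | O_APPEND | O_CREAT)
            or die "Error opening $ARGV[0]/$uniq2: $!";

        foreach my $uniquekey1 (keys %myHash1) {
            # Placeholder write: every key reuses the already-open handle.
            print {$CircleGroupHandle1} $myHash1{$uniquekey1};
        }

        close $CircleGroupHandle1 or die "Error closing $uniq2: $!";

    Opening the file once also funnels all writes through a single buffered handle, and the final close flushes that buffer, which makes silently lost records much less likely.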

      Here you will get the idea of what I am trying to do.

      file name: abc_def_hij.csv
      data on one line, comma separated: aaa,abc,aaa,def,aaa,hij,... etc.
      data on another line, comma separated: xxx,abc,xxx,def,xxx,hij,... etc.

      Now another source file may contain comma-separated data such as bbb,abc,bbb,def,bbb,hij,... etc., so the data from that other file should get appended to the same abc_def_hij.csv file. I will be grateful if you can solve this. One more thing: if I exclude strict and warnings, everything works fine, but it takes approximately 5 minutes to read 1 million records and create such files from them.
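      A minimal sketch of that routing, under the assumption (taken from the sample lines above) that fields 2, 4 and 6 of each record form the target file name; reading from STDIN and using $ARGV[0] as the output directory are illustrative choices:

          use strict;
          use warnings;

          my %handles;               # one cached handle per target file
          my $out_dir = $ARGV[0];    # output directory, as in the original code

          while (my $line = <STDIN>) {
              chomp $line;
              my @fields = split /,/, $line;

              # Assumed: fields 2, 4 and 6 (abc, def, hij above) name the file.
              my $target = join('_', @fields[1, 3, 5]) . '.csv';

              # Open each target file once and reuse the handle afterwards,
              # instead of reopening it for every record.
              my $fh = $handles{$target} //= do {
                  open my $h, '>>', "$out_dir/$target"
                      or die "Cannot open $out_dir/$target: $!";
                  $h;
              };
              print {$fh} $line, "\n";
          }

          # Close everything so buffered output is flushed to disk;
          # unflushed buffers are a common cause of apparently "lost" data.
          for my $fh (values %handles) {
              close $fh or die "Error closing a target file: $!";
          }

      The handle cache assumes the number of distinct target files stays below the operating system's open-file limit; if it does not, a module such as the core FileCache can juggle more files than the system allows open at once.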

        Can you post your new code?

        Also, you said it takes 5 minutes to process 1M records; what is your target time?