Roy.Amitabh has asked for the wisdom of the Perl Monks concerning the following question:

Well, I am writing simple code in which the contents of a .txt file are read into an array. Next, the contents of this array are copied into another array. When the contents of the new array are modified, the changes also appear in the old array the data was copied from. The new array is then written back to the same file. As a result, the file data is being duplicated.

Any help will be appreciated. Thanks.

################### Code Snippet ########################
#!/usr/bin/perl
#************* File Header ******************************
# File Name : nas_add_new_operation_url_logic.cgi
# Module    : WebInterfaces
# Notes     : Logic page for writing new Operation URL along with access rules
#             into Rules.txt
# History   : $Date: 26/01/05 12:34 $ $Revision: 3 $
# All Rights Reserved, Copyright (C) 2004, Hitachi, Ltd.
#*********************************************************
use CGI qw/:standard/;

my @dataWrite;
my @dataRead;
my $x;

undef @dataWrite;
undef @dataRead;

open (fileReadWrite, "../Rules1.txt");
my @dataRead = <fileReadWrite>;

foreach my $value1 (@dataRead) {
    push @dataWrite, $value1;
}
push @dataWrite, "Code1\n";

open(fileOUT, ">../Rules1.txt") or dienice("Can't open counter.txt: $!");
flock(fileOUT, 2);
seek(fileOUT, 0, 0);

foreach my $value3 (@dataWrite) {
    print fileOUT $value3;
}
close(fileOUT);

print "Content-type: text/html\n";
print("\n");
###########################################################


Re: file duplication error
by bart (Canon) on Feb 01, 2005 at 13:16 UTC
    You seem to be just overwriting your original file. That's not my idea of "duplicating a file". If you just happen to run the script twice, it'll append "Code1" twice. That's normal, I would think.

    You're really going through a lot of trouble to copy the contents of an array, where a simple

    @dataWrite = @dataRead;
    would do.

    Nowhere do you actually look at what @dataRead contains, after you modify @dataWrite. How would you know it changed? It didn't, I'm sure.
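
    If you want to convince yourself, here is a quick, self-contained check (plain Perl, not your script) showing that list assignment copies the values, so pushing onto the copy leaves the original alone:

    my @dataRead  = ("line1\n", "line2\n");
    my @dataWrite = @dataRead;        # list assignment copies the values

    push @dataWrite, "Code1\n";       # only the copy grows

    print scalar(@dataRead), "\n";    # prints 2: @dataRead is untouched
    print scalar(@dataWrite), "\n";   # prints 3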

    Oh, and this:

    open(fileOUT,">../Rules1.txt") or dienice("Can't open counter.txt: $!" +);; flock(fileOUT,2); seek(fileOUT,0,0);
    is a recipe for disaster: first you open the file for writing, clearing it in the process, and then you try to lock it. Too late. You can try replacing ">" with ">>", and add
    truncate fileOUT, 0;
    at the bottom, along the lines of the sketch below. It might work; it surely would on some platforms.
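
    An untested sketch of that order of operations (reusing your dienice() and the numeric flock constant; as said, it relies on the platform honouring seek() on an append-mode handle):

    # append mode: the file is NOT clobbered before we hold the lock
    open(fileOUT, ">>../Rules1.txt") or dienice("Can't open Rules1.txt: $!");
    flock(fileOUT, 2);       # 2 == LOCK_EX, exclusive lock
    seek(fileOUT, 0, 0);     # rewind to the beginning
    truncate(fileOUT, 0);    # only now is it safe to empty the file
    foreach my $value3 (@dataWrite) {
        print fileOUT $value3;
    }
    close(fileOUT);          # closing releases the lock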
Re: file duplication error
by borisz (Canon) on Feb 01, 2005 at 13:07 UTC
    Your code did not duplicate the data. Probably you did not show the important part. However, you should use strict and use warnings;. Check open for errors and probably use binmode on your filehandles. Here is a cleaned version of your code. Untested... Oops, I added the flocks.
    use strict;
    use warnings;
    use CGI qw/:standard/;
    use Fcntl ':flock';

    my ( @dataWrite, @dataRead );

    open( my $in, "<", "../Rules1.txt" ) or die $!;
    binmode $in;
    flock( $in, LOCK_SH );
    @dataWrite = @dataRead = <$in>;
    flock( $in, LOCK_UN );
    close $in;

    push @dataWrite, "Code1\n";

    open( my $out, ">", "../Rules1.txt" ) or dienice("Can't open counter.txt: $!");
    binmode $out;
    flock( $out, LOCK_EX );
    print $out @dataWrite;
    flock( $out, LOCK_UN );
    close $out;

    print "Content-type: text/html\n";
    print("\n");
    Boris
Re: file duplication error
by holli (Abbot) on Feb 01, 2005 at 13:14 UTC
    All your code does is append a line to the end of the file. That can be written far more simply as:
    open(fileOUT,">>../Rules1.txt") or dienice("Can't open counter.txt: $! +");; print fileOUT "Code1\n"; close(fileOUT);

    holli, regexed monk
Re: file duplication error
by blazar (Canon) on Feb 01, 2005 at 14:01 UTC
    #!/usr/bin/perl
    #************* File Header ******************************
    Well-written code should be mostly self-documenting. Have you considered POD instead? (Note: this is only a minor point!)
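    Just to illustrate, the same header information could live in POD (a sketch only, reusing the data from your comment block; remember that POD directives have to start in the first column):

    =head1 NAME

    nas_add_new_operation_url_logic.cgi - logic page for writing a new
    Operation URL along with access rules into Rules.txt

    =head1 HISTORY

    $Date: 26/01/05 12:34 $  $Revision: 3 $

    =cut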
    use CGI qw/:standard/;
    my @dataWrite;
    my @dataRead;
    my $x;
    Also
    my (@dataWrite, @dataRead, $x);
    undef @dataWrite;
    undef @dataRead;
    Really no need for these!
    open (fileReadWrite, "../Rules1.txt");
    my @dataRead = <fileReadWrite>;
    Ouch! Didn't you get told to always (yes, always!) check the return value of open()?!?

    Also, nowadays it's generally recommended to use lexical FHs and the three-args form of open(). Hence

    open my $fileReadWrite, '<', '../Rules1.txt' or die $!;
    But are you sure it's "ReadWrite"? (I'm contending that it's not an especially well chosen name...)
    foreach my $value1 (@dataRead) { push @dataWrite, $value1; }
    Ouch!
    @dataWrite = @dataRead;
    push @dataWrite, "Code1\n";
    So this is really all you wanted to do... hint: '>>'.
    open(fileOUT,">../Rules1.txt") or dienice("Can't open counter.txt: $!");;
    Ditto as above wrt open().

    What is dienice()? It's sensible to post minimal, but still working examples!
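
    Presumably it is some variation of the usual "die nicely into the browser" helper, i.e. roughly this (pure guesswork, since you didn't post it):

    sub dienice {
        my ($msg) = @_;
        print header(), start_html('Error'), h2('Error'), p($msg), end_html();
        exit;
    }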

    print "Content-type: text/html\n"; print("\n");
    Don't! There's no point in reinventing the wheel and risking getting it wrong, except perhaps for educational purposes. Since you're using CGI.pm in the first place, let it do this for you!
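    That is, since you already load CGI.pm with :standard, a single

    print header();   # prints the Content-type header plus the blank line that must follow it

    does the job.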
Re: file duplication error
by vek (Prior) on Feb 02, 2005 at 05:24 UTC

    This really has nothing to do with the problem at hand but I just wanted to add an observation.

    open (fileReadWrite,"../Rules1.txt");
    Always check the return value of open() and react accordingly:
    open(FILEREADWRITE, "../Rules1.txt") || die "Could not open Rules1.txt - $!\n";
    You are reading the entire file into memory here:
    my @dataRead = <fileReadWrite>;
    which is OK for small files, but might bite you if you try it with huge files.
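
    If the file can get big, a line-by-line loop keeps memory use flat (sketch only):

    open(FILEREADWRITE, "../Rules1.txt") || die "Could not open Rules1.txt - $!\n";
    while (my $line = <FILEREADWRITE>) {
        # work on one line at a time instead of slurping the whole file
    }
    close(FILEREADWRITE);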

    -- vek --