Speedfreak has asked for the wisdom of the Perl Monks concerning the following question:

Hej all,

I seem to have got myself in a mess with opening file handles.

I have a text file containing 10 lines of text which I want to read into an array, modify the array and then write the file back out.

Basically I want to clobber the file after it's been read, so that all the data in the file is junked and overwritten when the array is dumped back out again.

Code looks like this:

open(LOG, "+<$logfile") || die;
flock(LOG, LOCK_EX) || die;
while (<LOG>) {
    if (1..10) {
        chomp;
        push @entries, $_;
    }
    else {
        last;
    }
}
pop @entries;
unshift @entries, $log_string
foreach $entry (@entries) {
    print LOG $entry."\n";
}
close LOG;

The problem is, it's clobbering the file before it gets read into the array! :o(

What I need to do is read the file "as-is" but then when it gets written to, overwrite everything.

I think it's the redirection characters I'm using on $logfile in line one.

Can anyone help me out?

Replies are listed 'Best First'.
Re: Reading a file before clobbering it for output...
by bluto (Curate) on Jun 20, 2001 at 22:47 UTC
    This doesn't truncate the file before it gets read in. You're missing a ';' on your 'unshift' line, so this shouldn't even compile. Your code above doesn't replace the contents of the file; it just appends to it (kind of).

    Run perldoc on "truncate" and "seek" to find out how to remove the old content after reading.
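    A minimal sketch of that approach (the file name "demo.log" and the seed data are mine, just so the example is self-contained): open the file once in read/write mode, read it as-is, then seek back to the start and truncate before writing the modified lines out.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

my $logfile = 'demo.log';   # hypothetical demo file

# Seed a small sample log so the sketch is runnable on its own.
open my $seed, '>', $logfile or die "Cannot create $logfile: $!";
print {$seed} "line$_\n" for 1 .. 3;
close $seed;

# Open once in read/write mode and hold one lock for the whole job.
open my $log, '+<', $logfile or die "Cannot open $logfile: $!";
flock $log, LOCK_EX or die "Cannot lock $logfile: $!";

chomp(my @entries = <$log>);      # read the file as-is
unshift @entries, 'new entry';    # modify the array

seek $log, 0, 0 or die "Cannot seek: $!";        # rewind to the start
truncate $log, 0 or die "Cannot truncate: $!";   # junk the old contents
print {$log} map { "$_\n" } @entries;            # write it all back out
close $log;                                      # also releases the lock
```

    Since the handle is never closed and reopened, the flock is held for the whole read-modify-write cycle.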

Re: Reading a file before clobbering it for output...
by lemming (Priest) on Jun 20, 2001 at 22:43 UTC
    When I ran your code, I got different results: the first ten lines were always appended to the log file.
    The code below acts as expected, though I need to look into my open statements to make sure it doesn't clobber the lock.
    I replaced the while loop with a for loop, and the foreach loop with a simple print join.
    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    my $logfile = "logfile";
    my @entries;

    open(LOG, "+<$logfile") || die "Could not open $logfile for reading\n";
    flock(LOG, LOCK_EX) || die "Could not lock file $!\n";
    for (1..10) {
        chomp($_ = <LOG>);
        push @entries, $_;
    }

    # Do something with @entries

    seek(LOG, 0, 0);    # rewind before rewriting
    truncate(LOG, 0);
    print LOG join("\n", @entries), "\n";
    close(LOG);
    Update: Instead of the reopen, use truncate with the other open "+<"

    Further Update: Fixed that before bluto's good advice, after a bit of looking up some info. I knew the close would happen, but wasn't positive it would kill the lock; hence my warning up above. It also means the unlocking flock is unnecessary, since the lock is released on close.

      This won't work since opening the file twice in succession drops the file lock starting with the second open. (The file is closed in between, silently).
Re: Reading a file before clobbering it for output...
by kurt1992 (Novice) on Jun 20, 2001 at 22:57 UTC
    I'm a bit confused by the if (1..10) statement, but I think I see what you're trying to do: put the newest log entry at the top of the text file, and limit the text file to 10 lines, correct? This will do the trick. (I was lazy and didn't flock, and the code naively assumes $logfile exists already, but you get the idea.) It just uses an array slice in the last foreach loop to limit the number of lines.

    peace.

    #!/usr/bin/perl -w
    use strict;

    my $log_string = "warning2 . . .";
    my $logfile   = "log.txt";
    my @entries   = ();
    my $entry;

    open LOG, "< $logfile" or die "cannot read $logfile";
    @entries = <LOG>;
    map { chomp } @entries, $log_string;
    unshift @entries, $log_string;
    close LOG;

    open OUT, "> $logfile" or die "cannot write to $logfile";
    foreach $entry ( @entries[0..9] ) {
        print OUT $entry."\n";
    }
    close OUT;

      I'm a bit confused by the if (1..10) statement

      This is the bistable operator; take a look at 'range operator' in perlop. Basically, in scalar context a..b evaluates to false until condition a is met; then it evaluates to true until condition b is met. If one of the expressions is constant (as is the case here), an implicit comparison with $. (the current line number) takes place.
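      A minimal sketch of that behaviour (the in-memory filehandle and the variable names are mine, just for illustration): with constant operands, 1 .. 3 is compared against $. and so is true only while reading lines 1 through 3.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Five lines of sample data, read through an in-memory filehandle
# so the demo needs no external file.
my $text = "one\ntwo\nthree\nfour\nfive\n";
open my $fh, '<', \$text or die "Cannot open in-memory file: $!";

my @kept;
while (<$fh>) {
    if (1 .. 3) {      # flip-flop: compared against $., the line number
        chomp;
        push @kept, $_;
    }
}
close $fh;

print "@kept\n";       # one two three
```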

      -- Hofmator

Re: Reading a file before clobbering it for output...
by CharlesClarkson (Curate) on Jun 21, 2001 at 13:13 UTC

    Since we're printing @entries to a file, we don't need to chomp it and put the \n's back. We can just leave them on.

    # prepend $log_string to @entries
    my @entries = "$log_string\n";
    my $logfile = 'log.txt';
    Writing push @entries, <LOG>; would push the entire log onto @entries.
    Writing push @entries, scalar <LOG>; would push one line of the log onto @entries.
    Adding for 1 .. 10; adds 10 lines of the log to @entries.
    open LOG, $logfile or die "Cannot open $logfile: $!";
    push @entries, scalar <LOG> for 1 .. 10;
    close LOG;

    open LOG, "> $logfile" or die "Cannot open $logfile: $!";
    print LOG @entries;
    close LOG;

    I tested this on Windows. You're on your own with flock.

    HTH,
    Charles K. Clarkson