in reply to Reading a file before clobbering it for output...

When I ran your code, I got different results: the first ten lines were always appended to the log file.
The code below acts as expected, though I need to look into my open statements to make sure they don't clobber the lock.
I replaced the while loop with a for loop and the foreach loop with a simple print join.
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);   # needed for LOCK_EX

my $logfile = "logfile";
my @entries;

# Open read/write so the same (locked) handle can be truncated later
open(LOG, "+<$logfile") || die "Could not open $logfile for reading: $!\n";
flock(LOG, LOCK_EX)     || die "Could not lock file: $!\n";

for (1..10) {
    chomp($_ = <LOG>);
    push @entries, $_;
}

# Do something with @entries

# open(LOG, ">$logfile") || die "Could not open $logfile for writing\n";
seek(LOG, 0, 0);        # rewind so the rewrite starts at the top of the file
truncate(LOG, 0);
print LOG join("\n", @entries), "\n";
close(LOG);             # also releases the lock
Update: Instead of the reopen, I now use truncate along with the read/write open, "+<".

Further Update: I had fixed that before bluto's good advice, after looking up some info. I knew the close would happen; I wasn't positive it would release the lock, though. Hence my warning above. It also means the unlocking flock is unnecessary, since the lock is released on close.
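
To illustrate that last point, here is a minimal sketch (the filehandle and filename just mirror the code above): the exclusive lock is released by close itself, so no explicit LOCK_UN is needed.

#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

open(LOG, "+<logfile") || die "Could not open logfile: $!\n";
flock(LOG, LOCK_EX)    || die "Could not lock file: $!\n";
# ... read and rewrite the file here ...
# flock(LOG, LOCK_UN);  # not needed: the close below already releases the lock
close(LOG);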

Re: Re: Reading a file before clobbering it for output...
by bluto (Curate) on Jun 20, 2001 at 22:50 UTC
    This won't work since opening the file twice in succession drops the file lock starting with the second open. (The file is closed in between, silently).
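
    For reference, a minimal sketch of that failure mode, kept close to the commented-out reopen above (this is the broken pattern, not a fix): the second open implicitly closes LOG, which releases the flock, so another process can grab the file before the reopen completes.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    my $logfile = "logfile";
    open(LOG, "<$logfile") || die "Could not open $logfile for reading: $!\n";
    flock(LOG, LOCK_EX)    || die "Could not lock file: $!\n";
    my @entries = <LOG>;
    # The reopen below silently closes LOG first, dropping the exclusive lock,
    # so the window between here and the new open is unprotected.
    open(LOG, ">$logfile") || die "Could not open $logfile for writing: $!\n";
    print LOG @entries;
    close(LOG);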