Speedfreak has asked for the wisdom of the Perl Monks concerning the following question:
Hej all,
I seem to have got myself in a mess with opening file handles.
I have a text file containing 10 lines of text which I want to read into an array, modify the array, and then write the file back out.
Basically I want to clobber the file after it's been read, so that all the data in the file is junked and overwritten when the array is dumped back out again.
Code looks like this:
    open(LOG, "+<$logfile") || die;
    flock(LOG, LOCK_EX) || die;
    while (<LOG>) {
        if (1..10) {
            chomp;
            push @entries, $_;
        }
        else {
            last;
        }
    }
    pop @entries;
    unshift @entries, $log_string;
    foreach $entry (@entries) {
        print LOG $entry."\n";
    }
    close LOG;
The problem is, it's clobbering the file before it gets read into the array! :o(
What I need to do is read the file "as-is" but then when it gets written to, overwrite everything.
I think it's the directives I'm using on $logfile in line one.
Can anyone help me out?
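
For reference, a minimal sketch of one way to do this with a single handle: open with "+<" (which does not truncate on open), read everything, then seek back to the start and truncate before writing. The file name, $log_string value, and error messages below are made up for illustration.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock :seek);

    # Hypothetical values for illustration.
    my $logfile    = 'example.log';
    my $log_string = 'new first entry';

    open my $fh, '+<', $logfile or die "Can't open $logfile: $!";
    flock $fh, LOCK_EX          or die "Can't lock $logfile: $!";

    # Read the existing lines first; "+<" leaves the contents intact.
    chomp(my @entries = <$fh>);

    # Modify the array: drop the last line, add the new one at the top.
    pop @entries;
    unshift @entries, $log_string;

    # Now rewind and truncate so the rewrite overwrites everything.
    seek $fh, 0, SEEK_SET or die "Can't seek: $!";
    truncate $fh, 0       or die "Can't truncate: $!";

    print $fh "$_\n" for @entries;

    close $fh or die "Can't close $logfile: $!";

The key point is that "+<" never clobbers by itself; the old data is only discarded once seek and truncate reset the handle, which happens after the read.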
Replies are listed 'Best First'.

Re: Reading a file before clobbering it for output...
  by bluto (Curate) on Jun 20, 2001 at 22:47 UTC
Re: Reading a file before clobbering it for output...
  by lemming (Priest) on Jun 20, 2001 at 22:43 UTC
    by bluto (Curate) on Jun 20, 2001 at 22:50 UTC
Re: Reading a file before clobbering it for output...
  by kurt1992 (Novice) on Jun 20, 2001 at 22:57 UTC
    by Hofmator (Curate) on Jun 21, 2001 at 14:38 UTC
Re: Reading a file before clobbering it for output...
  by CharlesClarkson (Curate) on Jun 21, 2001 at 13:13 UTC