in reply to delete/over-write record in FlatFile
But back to the question. To "edit" a file like this you basically have to load the entire thing into memory, make your changes, and then write the whole thing back out. And you'll have to lock the file while you're doing it to make sure you don't get two editors working at the same time. The reason it's not possible to edit the file in-place is because the lines are of variable length, so editing means changing the length of the file and the position of lines lower down.
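One common way to hold that lock across the whole load-edit-rewrite cycle is an exclusive flock on a side-car lock file (a sketch of my own, not from the original post; the lock-file name "ext.log.lock" is arbitrary):

use Fcntl qw(:flock);

# Grab an exclusive lock on a separate lock file so only one
# editor can run the load/edit/rewrite cycle at a time.
open(my $lock, '>', 'ext.log.lock') or die "Can't open lock file: $!";
flock($lock, LOCK_EX) or die "Can't lock: $!";

# ... load, edit, and rewrite ext.log here ...

close $lock;    # releases the lock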
So, you'll start by loading all the data into a structure in memory. Something like (untested):
open (DB,"ext.log") or die("There was a problem opening the file."); my @data; while (my $entry=<DB>) { my @fields = split(/ /,$entry); push @data, \@fields; } close DB;
Now you'll make a change:
$data[5][1] = "New value!";
Then write it all back out again:
open(DB, ">", "ext.log") or die $!; foreach my $row (@data) { my $line = join(' ', @$row); print DB $line, "\n"; } close DB;
If your file is at all big this is going to totally suck. In that case you might look at using something like Tie::File. That will help with memory usage, but there's no helping the fact that making random edits to a sequential file will be slow.
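For reference, a minimal Tie::File sketch (assuming the same space-separated ext.log as above) looks something like this. The module presents each line of the file as an array element and writes changes back to disk for you:

use Tie::File;

# Each element of @lines is one line of the file; assigning to an
# element rewrites that line on disk.
tie my @lines, 'Tie::File', 'ext.log' or die "Can't tie ext.log: $!";

my @fields = split / /, $lines[5];
$fields[1] = "New value!";
$lines[5] = join ' ', @fields;

untie @lines;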
Now doesn't this make you want to do it right? Good! You should be using a database of some kind to hold the data. If you have a real DB like MySQL or Postgres already available then use that. If not you might look at something like DBD::SQLite to save you the trouble of getting a real DB installed. Then create a table, load your log data into it, and suddenly editing lines will be cake! And who doesn't like cake?
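If you go the SQLite route, the whole load-and-edit dance collapses to a few DBI calls. A rough sketch (the database file, table, and column names here are made up for illustration):

use DBI;

# Connect to (and create, if needed) a SQLite database file.
my $dbh = DBI->connect("dbi:SQLite:dbname=ext.db", "", "",
                       { RaiseError => 1, AutoCommit => 1 });

# One-time setup: a table for the log records.
$dbh->do("CREATE TABLE IF NOT EXISTS log (id INTEGER PRIMARY KEY, field1 TEXT, field2 TEXT)");

# Load a record.
$dbh->do("INSERT INTO log (field1, field2) VALUES (?, ?)", undef, "foo", "bar");

# Editing a record in place is now a single statement.
$dbh->do("UPDATE log SET field2 = ? WHERE id = ?", undef, "New value!", 6);

$dbh->disconnect;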
-sam
Replies are listed 'Best First'.
Re^2: delete/over-write record in FlatFile
  by lakeTrout (Scribe) on Mar 25, 2008 at 21:43 UTC
  by chromatic (Archbishop) on Mar 25, 2008 at 22:50 UTC
  by samtregar (Abbot) on Mar 25, 2008 at 23:28 UTC