in reply to cleaning up logs

Effectively, this isn't possible without either stopping (or pausing briefly) the program doing the logging, or modifying it so that it will create a new file if you rename the one it's using - assuming that is possible under your OS.

What you are asking for is the ability to delete lines from the front of the file whilst new lines are still being written to the end. Whilst I vaguely recall seeing this facility on a mainframe OS once, if you're running under Win32 or *nix, I am not aware of any filesystem that allows this type of operation. I'll no doubt be corrected if I got this wrong.

Even if your OS/FS allows you to rename the file out from under the program - which is unlikely unless the program opens/writes/closes the log file each time - you would still end up with only the new lines in the file rather than 30 days' worth plus the new ones.

Although the perl -i switch mentioned above saves you from explicitly having to open the file, it is still opened. In fact, what actually happens is that the file is renamed, then a new file is created with the original name; the renamed file is then read, and any lines you choose to print are written to the new one. This doesn't work if the file is already open.
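For illustration, here's a minimal shell sketch of the rename-then-recreate sequence that -i performs with a backup suffix (the file name log.txt, its contents, and the filter are my own hypothetical stand-ins, not anything from the original post):

```shell
# Stand-in log file with some old and new lines (hypothetical content).
printf 'old line 1\nold line 2\nnew line\n' > log.txt

# Step 1: the original file is renamed aside.
mv log.txt log.txt.bak

# Step 2: a fresh file is created under the original name, and only
# the lines we choose to keep are written into it.
grep -v '^old' log.txt.bak > log.txt

# Step 3: discard the renamed original (perl -i with no backup
# suffix keeps no copy at all).
rm log.txt.bak
```

A program that already holds log.txt open keeps writing to the renamed (and here deleted) file, which is exactly why this trick doesn't help with a live log.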

It is possible, under Win32 for sure and almost certainly under *nix, to take a copy of an opened file. You could then process this copy by archiving the old lines to one file and putting the last 30 days' worth in yet another. Whilst you are doing this, the program doing the logging would continue to append to the original logfile. You then have the problem of copying any new lines from the original file to the end of your newly created 30-day file, and then persuading the first program to start using the new one, which is just the same problem again.
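As a rough sketch of the copy-and-split step (the file names, the timestamp format, and the cutoff date are all assumptions made purely for illustration):

```shell
# Create a stand-in "live" log; a real one would still be held open
# by the logging program while we work on the copy.
printf '2002-08-01 old entry\n2002-09-10 recent entry\n' > app.log

# Take a copy of the (possibly open) file; the logger keeps appending
# to app.log while we process the snapshot.
cp app.log app.log.snapshot

# Split the snapshot: lines before the cutoff go to the archive,
# the last 30 days' worth go to a new file.
CUTOFF='2002-09-07'
awk -v c="$CUTOFF" '$1 <  c' app.log.snapshot > archive.log
awk -v c="$CUTOFF" '$1 >= c' app.log.snapshot > last30.log
```

Anything the logger appends to app.log after the cp exists only in the original, which is the race described above: you still have to carry those trailing lines across and swap the files, and that is the same problem all over again.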

The usual method of doing this kind of thing is to have the original program alternate between two log files every day, or every week, and then your archiving program would process the currently static file whilst the other is being written. Of course, if the original program doesn't already use this technique, it would require modification. If that is possible, it is the only option I can think of.
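A minimal sketch of that alternation, assuming a weekly flip and hypothetical file names (app.log.0 / app.log.1 are my inventions, not anything from the thread):

```shell
# Pick the active log by week parity; the other file is idle and
# therefore safe for the archiving program to read or rewrite.
week=$(( $(date +%s) / 604800 ))      # whole weeks since the epoch
if [ $(( week % 2 )) -eq 0 ]; then
    active=app.log.0 idle=app.log.1
else
    active=app.log.1 idle=app.log.0
fi

# The logging program appends only to the active file...
echo "a log entry" >> "$active"
# ...while the archiver processes "$idle" with no race at all.
```

The only coordination needed is agreeing on the flip schedule, which is why this is the usual approach rather than trying to trim a file that is being written.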


Cor! Like yer ring! ... HALO dammit! ... 'Ave it yer way! Hal-lo, Mister la-de-da. ... Like yer ring!

Replies are listed 'Best First'.
Re: Re: cleaning up logs
by gnu@perl (Pilgrim) on Oct 07, 2002 at 19:48 UTC
    Thanks for pointing this out. I didn't think to add the fact that the file is still open, and any data that goes to it after that point will not go through the filter.

    The assumption I made was that the file was written to in chronological order, so anything coming in while the script was running should be under the 30-day limit mentioned in the original post. This would make the point that the filter doesn't see this new data moot.

      My point was that using perl -i on an open file won't work, as it will try to rename that file and create a new one with the old name. The rename will fail.


        Are you sure it works like that? I tried the example below and it worked fine. Am I missing something?
        perl -e 'open(FILE,"./syslog") or die "Cannot open syslog: $!\n"; sleep 1000000000;' & then perl -i -p del.pl ./syslog

        This worked fine, next I tried this:

        perl -e 'use Fcntl;open(FILE,"./syslog") or die "Cannot open syslog: $!\n";flock(FILE,LOCK_EX); sleep 1000000000;' & then perl -i -p del.pl ./syslog

        A 'ps' both before and after the -i command showed the process holding the local syslog file still active. Both with and without the flock it worked fine. Please let me know if I am missing or misunderstanding something about the file locking?