PerlMonks

perlish way to 'tail -100 $logfile'

by gri6507 (Deacon)
on Dec 09, 2005 at 17:38 UTC ( [id://515602] )

gri6507 has asked for the wisdom of the Perl Monks concerning the following question:

Fellow monks,

I have a program that is started when a user logs into the system and stays alive (hopefully) until the user exits. This program occasionally logs some events to $logfile. Recently, I received a request from my boss to make sure that this logfile does not exceed some predefined size, which, for the sake of this discussion, will be 100 lines. In other words, I need a way to do `tail -100 $logfile > $logfile.tmp; mv $logfile.tmp $logfile` from inside my script. Is there a proper perlish way to do that?
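The shell pipeline above can be sketched in pure Perl: read the file once, keep only the last 100 lines in memory, write them to a temporary file, and rename it over the original. The name trim_logfile is illustrative, not part of the original program:

```perl
use strict;
use warnings;

# Hypothetical helper: trim $logfile to its last $keep lines,
# mimicking `tail -100 $logfile > $logfile.tmp; mv $logfile.tmp $logfile`.
sub trim_logfile {
    my ($logfile, $keep) = @_;

    open my $in, '<', $logfile or die "Can't read $logfile: $!";
    my @lines;
    while (my $line = <$in>) {
        push @lines, $line;
        shift @lines if @lines > $keep;   # retain only the last $keep lines
    }
    close $in;

    my $tmp = "$logfile.tmp";
    open my $out, '>', $tmp or die "Can't write $tmp: $!";
    print {$out} @lines;
    close $out or die "Can't close $tmp: $!";

    # rename is atomic on the same filesystem, so readers never
    # see a half-written log.
    rename $tmp, $logfile or die "Can't rename $tmp to $logfile: $!";
}
```

The rename step matters here: a reader who opens the log mid-trim sees either the old file or the new one, never a partial write.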

Replies are listed 'Best First'.
Re: perlish way to 'tail -100 $logfile'
by talexb (Chancellor) on Dec 09, 2005 at 17:45 UTC

    I've heard of File::Tail, though I haven't used it.

    Alex / talexb / Toronto

    "Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds

Re: perlish way to 'tail -100 $logfile'
by holli (Abbot) on Dec 09, 2005 at 17:44 UTC
Re: perlish way to 'tail -100 $logfile'
by isotope (Deacon) on Dec 09, 2005 at 18:04 UTC
    Does your log need to be a rolling log that always contains the most recent entries, or can it simply be restarted whenever it gets too big? This makes a difference in your approach. Your tail example leans more toward the former, while something like logrotate that just periodically clobbers (or archives) the file if it's too large would accomplish the latter and be much simpler.
    Update: To maintain a rolling logfile, if your program is persistent during a login session, you could simply maintain the most recent data internally (using an array of individual log entries, for example, push a new entry and shift the oldest one off), then dump the entire log to file each time it's updated. Not the best for maintaining logfile integrity, but the code would be very clean.


    --isotope
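The rolling in-memory approach in the update above might look like the following sketch. The names log_event, @log, and $max are hypothetical; rewriting the whole file on every event is what makes this simple but also what costs logfile integrity if the program dies mid-write:

```perl
use strict;
use warnings;

# Keep the most recent $max entries in memory and rewrite the
# whole file on every update, as described above.
my @log;
my $max = 100;

sub log_event {
    my ($logfile, $message) = @_;
    push @log, $message;
    shift @log while @log > $max;         # drop the oldest entries

    open my $fh, '>', $logfile or die "Can't write $logfile: $!";
    print {$fh} map { "$_\n" } @log;
    close $fh;
}
```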
      Ideally, I would like to do a rolling log of the 100 most recent entries, but I would not want to keep it in memory, because at any time during the customer's login session (during the program execution), they could want to take a look at the log file.
Re: perlish way to 'tail -100 $logfile'
by Roy Johnson (Monsignor) on Dec 09, 2005 at 18:02 UTC
    If it's an option, I'd say rewrite the logging portion of your program to use Tie::File. Then you could just maintain the tied array at 100 elements and your logfile would automagically be what you wanted.
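A minimal sketch of the Tie::File approach, assuming one log entry per line (append_capped is an illustrative name). Tie::File maps the file's lines onto a Perl array, so capping the array caps the file:

```perl
use strict;
use warnings;
use Tie::File;   # core module

# Append an entry and cap the log at $max lines.
sub append_capped {
    my ($logfile, $entry, $max) = @_;
    tie my @lines, 'Tie::File', $logfile
        or die "Can't tie $logfile: $!";
    push @lines, $entry;
    shift @lines while @lines > $max;     # edits are written back to the file
    untie @lines;
}
```

Because the file on disk always holds exactly the tied array's contents, the user can read the logfile at any point in the session and see the current rolling window.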

    Caution: Contents may have been coded under pressure.
Re: perlish way to 'tail -100 $logfile'
by ptum (Priest) on Dec 09, 2005 at 17:45 UTC

    It seems to me that you might simply want to evaluate the size of the file using stat or the -s file test operator, then conditionally mv the file based on the return from that.
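A sketch of that size-check-and-rotate idea using the -s file test (rotate_if_big and the byte threshold are illustrative). Note this is the simple clobber/archive approach, not a rolling tail:

```perl
use strict;
use warnings;
use File::Copy qw(move);   # core module

# If the log exceeds $max_bytes, archive it and let the
# program start a fresh one on its next write.
sub rotate_if_big {
    my ($logfile, $max_bytes) = @_;
    return unless -e $logfile && -s $logfile > $max_bytes;
    move($logfile, "$logfile.old")
        or die "Can't move $logfile: $!";
}
```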


    No good deed goes unpunished. -- (attributed to) Oscar Wilde
