in reply to How to view and filter logs in a database

If the volume of log files is a problem, the volume in a database could be an even worse one. A scanner, perhaps hooked into logrotate, could comb through the files and pick out the meaningful messages. The files could then be kept gzipped for a couple of days before being deep-sixed. The script that gathers up the log messages could put them into whatever database table or tables you want, and the tables in question would be indexed appropriately.
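
A minimal sketch of such a scanner in Perl, assuming a syslog-style line format and a hypothetical logs table with an index on its timestamp column; the DSN, credentials, column names, and the severity filter are all illustrative, not from any particular setup:

    #!/usr/bin/perl
    # Sketch: scan a rotated log file, keep only the meaningful lines,
    # and load them into an indexed database table. Assumes something like:
    #   CREATE TABLE logs (logged_at TIMESTAMP, severity TEXT, message TEXT);
    #   CREATE INDEX logs_logged_at ON logs (logged_at);
    use strict;
    use warnings;
    use DBI;

    my $file = shift or die "usage: $0 logfile\n";
    my $dbh  = DBI->connect('dbi:Pg:dbname=logs', 'loguser', 'secret',
                            { RaiseError => 1, AutoCommit => 0 });
    my $sth  = $dbh->prepare(
        'INSERT INTO logs (logged_at, severity, message) VALUES (?, ?, ?)'
    );

    open my $fh, '<', $file or die "open $file: $!";
    while (my $line = <$fh>) {
        chomp $line;
        # Illustrative line format: "2012-10-29 14:53:00 WARN something happened"
        my ($ts, $sev, $msg) = $line =~ /^(\S+ \S+) (\w+) (.*)$/
            or next;
        next unless $sev =~ /^(?:WARN|ERROR|FATAL)$/;  # keep the meaningful ones
        $sth->execute($ts, $sev, $msg);
    }
    close $fh;
    $dbh->commit;
    $dbh->disconnect;

Hooked into logrotate, something like this could run from a postrotate script, before the rotated file is compressed.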

Replies are listed 'Best First'.
Re^2: How to view and filter logs in a database
by chrestomanci (Priest) on Oct 29, 2012 at 14:53 UTC

    Fair point about the volume of the data, though I am hoping that logging to a database will help solve that.

    At the moment I am logging quite verbosely, as the verbose logs are needed when working out why something crashed. I am also keeping the log files for about two months for statistics and reporting, which does not need the verbose messages.

    My plan with logging to a database is that I can log verbosely and then purge all the verbose messages after a couple of days (unless something crashes), leaving behind a much smaller volume of non-verbose messages that will be kept for the standard two-month window.
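
    A sketch of such a purge, reusing the hypothetical logs table and severity column from the example above; the severity names and both retention windows are illustrative:

        # Sketch: drop verbose rows after a couple of days, then expire
        # everything past the two-month reporting window. Table and column
        # names are assumptions, not from the original post.
        use strict;
        use warnings;
        use DBI;

        my $dbh = DBI->connect('dbi:Pg:dbname=logs', 'loguser', 'secret',
                               { RaiseError => 1 });
        $dbh->do(q{
            DELETE FROM logs
            WHERE severity IN ('DEBUG', 'INFO')
              AND logged_at < now() - interval '2 days'
        });
        $dbh->do(q{
            DELETE FROM logs
            WHERE logged_at < now() - interval '2 months'
        });
        $dbh->disconnect;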

    This would rely on using a database that will efficiently compact and re-use the space from deleted records. I have already had bad experiences in that area with MySQL. Postgres or Couch look to be promising.