Speeding up logging by caching is all well and good until you need the log to trace a mysterious application crash, and the most recent 100 entries were lost in that very crash.
If you can isolate your logging from the application far enough that a crash doesn't affect it, for example by writing your own logging daemon or by using syslog, I would look at that approach first.
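For illustration, a rough sketch of what handing log entries to syslog via Log::Log4perl could look like; the appender name, ident and facility below are placeholders I picked, not anything from the original thread:

    use strict;
    use warnings;
    use Log::Log4perl;

    # Minimal sketch, assuming Log::Dispatch::Syslog is installed.
    # Once syslogd has the message, a crash of this process no longer
    # loses it.
    my $conf = q(
        log4perl.rootLogger               = INFO, Syslog
        log4perl.appender.Syslog          = Log::Dispatch::Syslog
        log4perl.appender.Syslog.ident    = myapp
        log4perl.appender.Syslog.facility = local0
        log4perl.appender.Syslog.layout   = Log::Log4perl::Layout::SimpleLayout
    );

    Log::Log4perl->init(\$conf);
    my $logger = Log::Log4perl->get_logger();
    $logger->info("handled by syslogd, outside the application process");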
If you don't really have to deal with application crashes, because the logs are mainly for application auditing and the application itself is usually stable, then caching entries before writing them to disk might be a viable approach.
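Something like this is what I mean by caching: collect entries in memory and flush them to SQLite in a single transaction, which is much cheaper than one transaction per INSERT. The database file, table layout and flush threshold are made up for the sake of the example:

    use strict;
    use warnings;
    use DBI;

    # Assumes a "log" table with ts/message columns already exists.
    my $dbh = DBI->connect("dbi:SQLite:dbname=logdb.sqlite", "", "",
                           { RaiseError => 1, AutoCommit => 1 });

    my @cache;

    sub log_entry {
        my ($message) = @_;
        push @cache, [ time(), $message ];
        flush_cache() if @cache >= 100;    # flush every 100 entries
    }

    sub flush_cache {
        return unless @cache;
        $dbh->begin_work;
        my $sth = $dbh->prepare("INSERT INTO log (ts, message) VALUES (?, ?)");
        $sth->execute(@$_) for @cache;
        $dbh->commit;
        @cache = ();
    }

    # Best effort on normal exit; entries still in @cache are lost on a crash.
    END { flush_cache() }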
Personally, I would look at the pragmata that SQLite itself offers, for example PRAGMA synchronous = OFF, which gives up some durability in exchange for improved performance.
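Issuing the pragma on a plain DBD::SQLite handle would look roughly like this; with Log::Log4perl::Appender::DBI you would have to get at the handle the appender actually uses, which I'm glossing over here:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect("dbi:SQLite:dbname=logdb.sqlite", "", "",
                           { RaiseError => 1 });

    # Trade durability for speed: SQLite no longer waits for fsync()
    # after each write, so a crash can lose the most recent entries.
    $dbh->do("PRAGMA synchronous = OFF");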
In reply to Re: Log::Log4perl::Appender::DBI and SQLite by Corion, in thread Log::Log4perl::Appender::DBI and SQLite by clueless newbie