I've considered the approach of using an SQLite database as well. It makes formulating the statistics I want to collect much easier, and extending it is just a matter of coming up with the right SQL instead of having to muck around with the counters manually. But what has kept me away from this so far is that I shy away from storing all the data in memory, and that more or less a full table scan would need to be done every second. Of course, I could cheat here and only update the seconds-resolution statistics every second while updating (say) the minute-resolution statistics every five seconds...
It seems that I'll have to benchmark SQLite and its memory/disk requirements and compare them to the bytestring version.
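To make the trade-off concrete, here is a minimal sketch of the idea using Python's built-in sqlite3 module as a stand-in for DBD::SQLite (the table name, columns, and simulated traffic are all hypothetical, just to show the shape of the approach): raw events go into an in-memory table, and each resolution of statistics is a single GROUP BY query rather than a hand-maintained set of counters.

```python
import sqlite3
import time

# Hypothetical schema: one row per observed event, in memory.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (ts INTEGER, bytes INTEGER)")

# Simulated traffic spread over five distinct seconds.
now = int(time.time())
rows = [(now - i % 5, 100 + i) for i in range(20)]
db.executemany("INSERT INTO events VALUES (?, ?)", rows)

# Seconds-resolution statistics: one query instead of manual counters.
# This is the query that amounts to a table scan every second.
per_second = db.execute(
    "SELECT ts, COUNT(*), SUM(bytes) FROM events"
    " GROUP BY ts ORDER BY ts"
).fetchall()

# Minute-resolution rollup: the same data, a different GROUP BY.
# This one could be refreshed only every five seconds.
per_minute = db.execute(
    "SELECT ts / 60 AS minute, COUNT(*), SUM(bytes) FROM events"
    " GROUP BY minute ORDER BY minute"
).fetchall()
```

The appeal is that adding a new statistic means adding a query, not new counter-update code; the cost is exactly the repeated scan (an index on `ts` would help, at additional memory).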
In reply to Re^2: Data structure for statistics
by Corion
in thread Data structure for statistics
by Corion