For your current problem, "precompute" is probably the best answer, as everyone else has said.
You could do something more elaborate and map/reduce the statistics gathering in parallel across sections of the file, but it's probably not worth the complexity; a rough sketch of that approach follows.
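Just to show what that parallel version might look like (purely illustrative, and untested against your actual logs): fork one worker per byte range of the file, snap the range boundaries to line endings so no line is counted twice, have each worker tally its own count, and sum the results in the parent. The file name, worker count, and ERROR pattern below are all made-up placeholders, not anything from your setup.

#!/usr/bin/perl
use strict;
use warnings;

# Rough sketch only: split the log into byte ranges, fork one child per
# range, have each child count lines matching a hypothetical pattern,
# then sum the per-chunk counts in the parent via pipes.
my $file    = shift // 'huge.log';   # placeholder input file
my $workers = 4;                     # placeholder worker count
my $pattern = qr/ERROR/;             # placeholder statistic to gather

my $size  = -s $file or die "Can't stat $file: $!";
my $chunk = int($size / $workers) + 1;

my @pipes;
for my $i (0 .. $workers - 1) {
    pipe(my $r, my $w) or die "pipe: $!";
    my $pid = fork() // die "fork: $!";
    if ($pid) {                      # parent: keep the read end, move on
        close $w;
        push @pipes, $r;
        next;
    }

    # Child: count matching lines whose first byte lies in [$start, $end).
    close $r;
    my ($start, $end) = ($i * $chunk, ($i + 1) * $chunk);
    open my $fh, '<', $file or die "open $file: $!";
    if ($start > 0) {
        seek $fh, $start - 1, 0;
        my $partial = <$fh>;         # previous worker finishes this line
    }
    my $count = 0;
    until (eof($fh) or tell($fh) >= $end) {
        my $line = <$fh>;
        $count++ if $line =~ $pattern;
    }
    print {$w} "$count\n";
    exit 0;
}

# "Reduce" step: collect and sum each child's count.
my $total = 0;
for my $r (@pipes) {
    chomp(my $n = <$r>);
    $total += $n;
    close $r;
}
wait for 1 .. $workers;
print "matching lines: $total\n";

Even then, a single sequential pass over a 100 MB to 1 GB file is mostly I/O-bound, so the forking overhead buys you little, which is why precomputing the summaries ahead of time still looks like the better bet.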