As logs tend to be mostly sorted already, you can likely avoid the sort, which is the only part that's likely to be a problem memory-wise.
They're sorted by the time the request finishes, but the time logged is when the request was made ... so a long-running CGI script or a request to transfer a large file at the end of day N might appear after lines for day N+1.
But you still don't have to sort the whole file: you can process everything in order, then in a second pass sum up the values that got split up.
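A minimal sketch of that two-pass idea, in Python for brevity (the log format here is a hypothetical simplified `day bytes` per line; a real Apache log would need a regex to pull out the date and byte count):

```python
from collections import OrderedDict

def daily_totals(lines):
    """First pass: stream the log, emitting (day, subtotal) pairs in the
    order days appear. Because a few lines can be slightly out of order,
    one day may show up as several separate runs."""
    current_day, total = None, 0
    for line in lines:
        day, nbytes = line.split()       # hypothetical "YYYY-MM-DD bytes" format
        if day != current_day:
            if current_day is not None:
                yield current_day, total
            current_day, total = day, 0
        total += int(nbytes)
    if current_day is not None:
        yield current_day, total

def merge_runs(pairs):
    """Second pass: sum the subtotals that got split up, so each day
    ends with a single total. Only the per-day subtotals need to fit
    in memory, never the whole log."""
    merged = OrderedDict()
    for day, total in pairs:
        merged[day] = merged.get(day, 0) + total
    return merged
```

The point is that only the small list of per-day subtotals from the first pass is held in memory; the log itself is streamed once and never sorted.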
In reply to Re^2: Working with a very large log file (parsing data out)
by jhourcle
in thread Working with a very large log file (parsing data out)
by calebcall