Dear Lauren_R,
Thanks for your reply, it has very interesting ideas (I read it more than 5 times ^_^). And yes, what you suggested was true.
What I have in mind now (going with the first idea) is to load the logs into a DB (we will use MySQL - thanks sundialsvc4 for the idea of using a DB), then run a GROUP BY query and write the results to the target files. This will reduce the number of file opens and closes.
I think with the right schedule we can handle all the files without any delay.
And we will try the second idea too, because it is also a good approach to resolving the issue.
We will compare the two ideas and of course choose the best ;)
I will update shortly.
BR
Hosen
In reply to Re^2: best way to fast write to large number of files
by Hosen1989
in thread best way to fast write to large number of files
by Hosen1989