The questions are "Why are you opening and closing after every file?" and "What do you mean by 'intelligent' logging?". Or more generally, "What problem are you trying to solve?".
Turning off buffering (or turning on auto-flush, which amounts to the same thing) for a particular filehandle can be done easily using select:
select $fh; $| = 1; select STDOUT;
but turning autoflush on/buffering off will slow your IO down in most cases. You'd be better off increasing the buffering if you are writing large volumes to the same file.
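For completeness, the more readable modern idiom for the select/$| dance is the autoflush method from IO::Handle. A minimal sketch (the filename app.log is just a placeholder):

```perl
use strict;
use warnings;
use IO::Handle;   # gives filehandles an autoflush() method

open my $fh, '>>', 'app.log' or die "open: $!";
$fh->autoflush(1);            # same effect as: select $fh; $| = 1; select STDOUT;

print {$fh} "a log line\n";   # now hits the file immediately, at a cost per write
close $fh;
```

But, as above, leave autoflush off unless you genuinely need every line on disk the instant it is printed.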
If IO is really slowing your processing substantially (and you need to verify that this is the case, otherwise you're wasting your time), and once you have stopped constantly reopening the file, which is totally unnecessary if it is the same file, then there are some things you can do to reduce Perl's internal overhead that might make a little difference.
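Verifying that IO is actually the problem can be as simple as timing the write path separately from the rest of the run. A minimal sketch using the core Time::HiRes module (the surrounding processing loop is assumed, not shown):

```perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

my $io_elapsed = 0;
open my $log, '>>', 'app.log' or die "open: $!";

for my $record (1 .. 100_000) {       # stand-in for your real processing loop
    # ... real work happens here ...

    my $t0 = [gettimeofday];
    print {$log} "processed record $record\n";
    $io_elapsed += tv_interval($t0);  # accumulate time spent in the print
}

close $log;
printf "time spent in logging IO: %.3fs\n", $io_elapsed;
```

If that figure is a small fraction of your total runtime, optimising the logging is wasted effort.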
If that is still not sufficient for your needs, then you can implement a form of asynchronous IO using a thread and an in-memory filehandle to provide additional buffering; but for that to be effective, you need to be generating very large volumes of data at very high speed.
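The shape of that thread-based approach can be sketched with the core threads and Thread::Queue modules; this uses a queue rather than a literal in-memory filehandle to carry the buffered data between threads, but the idea is the same: the producer pays only for a cheap in-memory hand-off, and one writer thread does all the actual disk IO behind it. The filename and messages are placeholders.

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;

my $q = Thread::Queue->new;

# Writer thread: drains the queue and does the actual (buffered) disk IO.
my $writer = threads->create(sub {
    open my $log, '>>', 'app.log' or die "open: $!";
    while (defined(my $chunk = $q->dequeue)) {
        print {$log} $chunk;
    }
    close $log;
});

# Producer: each "log" call is just a queue push, not a write syscall.
$q->enqueue("event $_\n") for 1 .. 10;

$q->enqueue(undef);   # undef signals end-of-data to the writer
$writer->join;        # wait until everything has been flushed to disk
```

Whether this wins over plain buffered prints depends entirely on your data rate; at modest volumes the queueing overhead will eat the gain.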
If you're running on Win32, you could use real asynchronous IO by dropping to the system API level.
But for any of these things to be worth the effort, you have to know that IO is your bottleneck.
In reply to Re: Intelligent logging
by BrowserUk
in thread Intelligent logging
by Muoyo