Many of the logging modules, such as Log::Log4perl, include provisions to rotate logs when they reach a certain size.
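As a minimal sketch of size-based rotation with Log::Log4perl, assuming the Log::Dispatch::FileRotate appender is installed (the file name, size limit, and number of kept files below are illustrative, not prescriptive):

```perl
# log4perl.conf -- rotate crawl.log when it reaches ~10 MB, keep 5 old copies
log4perl.logger                              = INFO, Logfile
log4perl.appender.Logfile                    = Log::Dispatch::FileRotate
log4perl.appender.Logfile.filename           = crawl.log
log4perl.appender.Logfile.mode               = append
log4perl.appender.Logfile.size               = 10485760
log4perl.appender.Logfile.max                = 5
log4perl.appender.Logfile.layout             = Log::Log4perl::Layout::PatternLayout
log4perl.appender.Logfile.layout.ConversionPattern = %d %p %m%n
```

Load it once at startup with `Log::Log4perl->init("log4perl.conf")`; rotation then happens automatically, with no changes to the logging calls themselves.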
You say running the script for long periods results in large log files ... for what value of "long period"? If it means several weeks, the normal log-module/rotatelogs practice of rotating logs daily and keeping the last four or five will be quite sufficient, and more economical. If "long periods" means more than ten minutes, perhaps you should re-evaluate what is logged and what is not. Obviously you need different levels of logging when things are running well than when you are tracking down a bug; I suggest using a logging module.
Update: Have the script log whether each URL was successfully processed or not. Then you can go back and run the script manually with mega-mega-mega-MEGA debugging detail on the problem URL, while the ordinary runs generate only a few MB. You might have problem URLs saved to a special file, or the script could send you email with the details.
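A sketch of that idea using Log::Log4perl's `:easy` mode; `process_url` and the per-URL loop are hypothetical stand-ins for your existing fetch code, not part of any real API:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Log::Log4perl qw(:easy);

# Normal runs log at INFO; flip to $DEBUG when re-running a problem URL.
Log::Log4perl->easy_init($INFO);

for my $url (@ARGV) {
    if ( process_url($url) ) {
        INFO("OK   $url");
    }
    else {
        # grep the log for FAIL, then re-run just those URLs at DEBUG level
        ERROR("FAIL $url");
        DEBUG("extra detail that only appears at DEBUG level");
    }
}

sub process_url { ... }    # your existing fetch/parse code goes here
```

The success/failure lines stay tiny in normal operation; the verbose detail only costs disk space when you deliberately turn it on.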
--
TTTATCGGTCGTTATATAGATGTTTGCA
In reply to Re: Controlling file size when writing
by TomDLux
in thread Controlling file size when writing
by Vautrin