Would opening and closing files thousands of times be more of a performance hit than opening them just 80 or so times?
If generating 5-day graphs, @DATA would be approximately 288*5*80 lines.
Would it perhaps be better to open all the files first before the loop and then close them all after?
Sure, you could open all the files first and store the filehandles in a hash keyed by servername. Depending on the size of the records, you could also process the data in one loop, store it in memory, and write it all out in a second loop. There are many ways to do it; you might try benchmarking a few to see what works best for you.
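A minimal sketch of the hash-of-filehandles approach. The server names, filename pattern, and record format here are all placeholders, not from the original post; the point is simply that each file is opened once before the loop and closed once after it.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical server list -- in practice this would be your ~80 names.
my @servers = qw(web01 web02 db01);

# Open every output file up front; keep the lexical filehandles
# in a hash keyed by servername.
my %fh_for;
for my $server (@servers) {
    open my $fh, '>', "$server.dat"
        or die "Can't open $server.dat: $!";
    $fh_for{$server} = $fh;
}

# Single pass over the data: look up the right filehandle per record.
# Assumes the servername is the first whitespace-separated field.
while ( my $line = <DATA> ) {
    my ($server) = split /\s+/, $line;
    next unless exists $fh_for{$server};    # skip unknown servers
    print { $fh_for{$server} } $line;
}

# Close everything once, after the loop.
for my $server ( keys %fh_for ) {
    close $fh_for{$server}
        or warn "Error closing $server.dat: $!";
}

__DATA__
web01 2024-01-01T00:00 load=0.42
db01  2024-01-01T00:00 load=0.10
web02 2024-01-01T00:00 load=0.31
```

Note the `print { $fh_for{$server} } $line` braces: when the filehandle is an expression rather than a bareword, Perl needs the block form. If you want to compare this against the open/close-per-record version, the core Benchmark module's `cmpthese` is a convenient way to time both.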