In reply to: file test operations

Let's assume you have a program that takes one hour to run:

    somefunction();                        # takes 1 second to run
    foreach ( @bigmightarrayof360000 ) {
        someotherfunction($_);             # takes 0.01 seconds to run
    }

Now, if you could throw away some error-checking code in somefunction() to cut its runtime down to 0.5 seconds, would you do it? Your program would gain an unnoticeable speedup, and in case of an error it would either fail silently or even produce inconsistent data.
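To put rough numbers on that, here is a back-of-the-envelope sketch using the figures from the example above (the numbers are only illustrative):

    use strict;
    use warnings;

    # Rough arithmetic for the example above
    my $loop_time = 360_000 * 0.01;    # 3600 seconds spent in the loop
    my $total     = 1 + $loop_time;    # 3601 seconds, roughly one hour
    my $saved     = 0.5;               # halving somefunction() saves 0.5 seconds
    printf "speedup: %.4f%%\n", 100 * $saved / $total;    # about 0.0139%

Half a second out of an hour is noise; the loop is where virtually all of the time goes.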

There are a few golden rules in optimization:

* Don't optimize when you don't need to; if the program already runs fast enough, leave it alone.

* Don't optimize prematurely

* Don't optimize where it doesn't count. Typically, something like 5% of the code is responsible for 95% of the runtime; if you optimize in the other 95% of the code, you waste your time for minimal gain (see the measurement sketch after this list).

* Optimization is a trade-off. You might gain speed, but most of the time you lose readability, memory usage, maintainability, or safety. So be sure you can afford the loss.
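That 5%/95% point is also why measuring comes before optimizing. Here is a minimal sketch using Perl's core Benchmark module; the two subs are only hypothetical stand-ins for somefunction() and someotherfunction() from the example above:

    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    # Hypothetical stand-ins for the routines from the example above
    sub somefunction      { my $x = 0; $x += sqrt($_) for 1 .. 10_000; return $x }
    sub someotherfunction { my ($n) = @_; return $n * $n }

    # Run each sub for about 3 CPU seconds and compare calls per second,
    # so you know where the time actually goes before changing anything.
    cmpthese(-3, {
        somefunction      => sub { somefunction() },
        someotherfunction => sub { someotherfunction(42) },
    });

For a whole program rather than individual subs, a profiler such as Devel::NYTProf gives a per-line breakdown, which is usually the quickest way to find the 5% that matters.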

You might also like to read the section 'When to optimize' in the Wikipedia article Optimization_(computer_science).