in reply to when do you stop writing test?

I advocate prevention rather than cure: spend sufficient time on code design to limit the complexity of testing and, equally, to make the tests a doddle to maintain. In my experience even the most daunting of requirements can, and should, be reduced to the simplest technical design that does the job.

Or, to paraphrase C.J. Date (An Introduction to Database Systems): a computer system should reflect the simplest model capable of supporting the data, rather than an effort to model the real world.

Update: But if you've arrived at the end of that road, willingly or not, the most popular standard for a "complete" test set seems to be the set of functionally unique (though possibly arbitrarily chosen) permutative cases for each requirement specified for the system.
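As a hypothetical illustration (not from the original post), the "permutative cases" idea amounts to picking one representative value per functionally unique class for each input, then taking the cross product. The field names and values below are assumptions for the sake of the sketch:

```python
from itertools import product

# Hypothetical requirement: a login form with three independent inputs,
# each reduced to a small set of functionally unique values (one
# arbitrarily chosen representative per equivalence class).
usernames = ["valid_user", "unknown_user", ""]  # exists / absent / empty
passwords = ["correct", "wrong"]                # matches / does not match
remember = [True, False]                        # checkbox on / off

# The "complete" permutative test set is the cross product of the classes:
# every combination of one value from each input.
cases = list(product(usernames, passwords, remember))

print(len(cases))  # 3 * 2 * 2 = 12 cases
```

The point of reducing each input to equivalence classes first is that the cross product stays small; taking every raw value instead would explode combinatorially.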

-M

Free your mind

Replies are listed 'Best First'.
Re^2: when do you stop writing test?
by NatureFocus (Scribe) on Feb 12, 2007 at 17:23 UTC

I agree with Moron about the simplicity of design. Poor design produces overly complex systems, which in turn breed obscure bugs. Write solid code. KIS - Keep It Simple.

    Also, you often will not have time to do a lot of testing because of deadlines, resource shortages, or marketplace demands, so you need to make your testing count. Test what is most important first, then work your way through the rest.

    One thing we found important is to have a non-programmer do some of the final testing (validation). Most programmers won't try hard enough to break their own code. A good validation engineer will put negative numbers in the wrong field, press control-c 257 times in a row, paste chapter 7 of "War and Peace" into a field, blast DTMF into the ear of an operator to get them to hang up, or do any number of things you would never think of. However, do not rely on them for all the testing; you need to deliver a good working product to them. Their job is to find the things you didn't think of.
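    Some of that hostile input can also be automated before the validation engineer ever sees the build. The following is a minimal sketch; the validator `parse_quantity` and its rules are assumptions invented for the example, not anything from the post:

    ```python
    def parse_quantity(text):
        """Hypothetical field validator: accept only a positive integer."""
        value = int(text)  # raises ValueError on non-numeric input
        if value <= 0:
            raise ValueError("quantity must be positive")
        return value

    # Inputs in the spirit of the validation engineer's tricks.
    hostile_inputs = [
        "-1",                       # negative number in the wrong field
        "",                         # empty submission
        "\x03" * 257,               # 257 control-c characters in a row
        "War and Peace " * 10_000,  # a chapter-sized paste
    ]

    for text in hostile_inputs:
        try:
            parse_quantity(text)
            print(f"ACCEPTED (possible bug): {text[:20]!r}")
        except ValueError:
            print(f"rejected as expected: {text[:20]!r}")
    ```

    Automating the cheap abuse cases this way frees the human validator to spend their time on the genuinely creative attacks.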

    -Eugene