Why not just create an issue for adding a test, write it and add it to the automated test suite, and then resolve the issue?

Because we track the entire product, including its documentation, electronics, mechanics, and pneumatics (if any). There are tests that cannot be automated, at least not at a sane price. One of our current projects has a container for some liquid, with a level sensor and a tilt sensor. The container has a manually operated draining valve and - for development and testing - a simple funnel on the top side. (A little bit like the fuel tank on old motorbikes, but for a completely different purpose.) I don't know the exact test spec yet, but I bet you will find instructions like these in the test plans:

  1. ...
  2. Fill ten volume units of liquid into the container
  3. Wait three time units
  4. Check if the volume displayed by the device is between 9.5 and 10.5 units
  5. Tilt the container by five angle units
  6. Wait one time unit
  7. Check if the device issued a tilt alarm
  8. Return the container to level
  9. Wait one time unit
  10. Check that the alarm has turned off
  11. Drain five volume units out of the container
  12. Wait one time unit
  13. Check that the device issued a leakage alarm
  14. ...

Yes, these are very detailed baby steps of actually using (or misusing) the device under test. Using your fingers, ears, eyes, and a little bit of that grey filling material between the ears. Pretend to be a smart or dumb user.

Yes, you could automate that, using a machine three times as complex as the device under test. Or just tell an intern to open test plan #47893 in a browser and follow the instructions.
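If you want a middle ground, the checklist itself can live in a trivial script instead of a browser tab. Here is a minimal sketch (purely illustrative, not taken from any real project): it walks the tester through each step, asks for a pass/fail verdict, and writes a result log that can go straight into the documentation. The step texts, the plan number, and the log file name are all made up.

  #!/usr/bin/perl
  # Minimal guided test runner - an illustrative sketch only.
  # It walks a human tester through the steps of a manual test plan
  # and records a PASS/FAIL verdict plus an optional remark per step.
  use strict;
  use warnings;

  # Invented excerpt of "test plan #47893"; real steps would come
  # from the controlled test specification, not from a script.
  my @steps = (
      'Fill ten volume units of liquid into the container',
      'Wait three time units',
      'Check if the displayed volume is between 9.5 and 10.5 units',
      'Tilt the container by five angle units',
      'Check if the device issued a tilt alarm',
  );

  open my $log, '>', 'testplan-47893-results.txt'
      or die "Cannot open result log: $!";
  print {$log} 'Test plan #47893, executed ' . localtime() . "\n";

  my $n = 0;
  for my $step (@steps) {
      $n++;
      print "Step $n: $step\n";
      print '  Result ([p]ass, [f]ail, or free-text remark): ';
      my $answer = <STDIN>;
      defined $answer or last;    # stop cleanly on end of input
      chomp $answer;
      my $verdict = $answer =~ /^p/i ? 'PASS' : 'FAIL';
      print {$log} "Step $n: $verdict - $step\n";
      print {$log} "  Remark: $answer\n" if $answer !~ /^[pf]?$/i;
  }
  close $log or die "Cannot close result log: $!";
  print "Done. Results are in testplan-47893-results.txt\n";

Even that is arguably more tooling than the job needs; a printed copy of the plan and a pen do the same job.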

These tests are usually executed and documented ONCE for the entire device, before handing it over to the client. The test instructions and results are part of the documentation that is handed to the client. Maybe after a year or two, hardware and/or software are modified to better match the client's needs, the tests are modified to match the new requirements, and then those tests are run ONCE to confirm that all requirements are still fulfilled.

So even just thinking about automating them is way too expensive. Interns are way cheaper than automating those tests. Even if they completely f*** up a device under test.

Another part of the tests is simply reading the source code (or schematics, layouts, hardware drawings). Compiler warnings are great, and lint finds a lot of extra mess-ups, but having another developer look at the code (or schematics, plans) can find a lot of those nasty edge cases everybody hates to debug.
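To make the "nasty edge cases" concrete, here is an invented Perl example (the names and limits are mine, not from the project above). strict, warnings, and lint will likely stay silent about both functions, yet a reviewer who has read the test plan should notice that the boundary values 9.5 and 10.5 are excluded instead of included, and that a tilt in the other direction never triggers the alarm:

  use strict;
  use warnings;

  # Does the displayed volume match the filled volume within 0.5 units?
  sub volume_ok {
      my ($displayed, $expected) = @_;
      return $displayed > $expected - 0.5 && $displayed < $expected + 0.5;
      # Reviewer's catch: the spec says "between 9.5 and 10.5", so the
      # boundaries should pass, but > and < exclude them.
  }

  # Should the tilt alarm be raised for this tilt angle?
  sub tilt_alarm {
      my ($angle) = @_;
      return $angle > 5;
      # Reviewer's catch: tilting the device the other way yields a
      # negative angle and never raises the alarm; abs($angle) > 5
      # was probably meant.
  }

  print volume_ok(10.5, 10) ? "in range\n"   : "out of range\n";  # prints "out of range"
  print tilt_alarm(-7)      ? "tilt alarm\n" : "no alarm\n";      # prints "no alarm"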

Alexander

--
Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
