in reply to Re^2: (OT) Tracking Issues, Requirements, Tests
in thread (OT) Tracking Issues, Requirements, Tests

Ok, so not software tests. Well, then that goes to my point about the wiki: each of these platforms has a version-controlled wiki, and you could easily set up a standard location where all real-world test plans are documented. If you want them in the same repo as the code, just designate a subdirectory for them and write them in Markdown, which renders nicely and is easy to edit from the web interface.
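
For instance, a hypothetical layout (the names here are invented, just to show the idea of one designated subdirectory) could be as simple as:

    docs/
      test-plans/
        power-supply-tests.md
        thermal-tests.md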

Re^4: (OT) Tracking Issues, Requirements, Tests
by afoken (Chancellor) on Apr 10, 2025 at 07:45 UTC
    [...] version controlled wiki [...] for all real-world testing plans to be documented [...]

    It's not just about documenting the tests. It's also about traceability. You don't write tests like a poem, "inspired" by the requirements. Each and every requirement needs to have at least one test. A "test" may be a lab test, software reading, or datasheet reading (e.g. if a requirement demands UL-listed parts). In the end, you end up with a lot of tests, each of which verifies at least one requirement. You have test plans, grouping tests reasonably (e.g. you don't mix lab tests and software reading). And you have test executions, documenting that a test plan was executed partially or completely, including the test results. All of that can be traced back to the requirements.
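
    A minimal sketch of that trace structure in Perl (all IDs and fields are invented for illustration; this is not how R4J/T4J store things): each test records which requirements it verifies, and the reverse map immediately shows which requirements are still uncovered.

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Hypothetical requirement IDs -- in practice these come from
        # the requirements documents, not a hard-coded list.
        my @requirements = qw(REQ-001 REQ-002 REQ-003);

        # Hypothetical tests, each listing the requirements it verifies.
        my %tests = (
            'TST-010' => { kind => 'lab test',          verifies => ['REQ-001'] },
            'TST-011' => { kind => 'datasheet reading', verifies => ['REQ-002', 'REQ-003'] },
        );

        # Build the reverse trace: requirement -> tests verifying it.
        my %trace;
        for my $test (sort keys %tests) {
            push @{ $trace{$_} }, $test for @{ $tests{$test}{verifies} };
        }

        # Report coverage; complain about any requirement without a test.
        my $missing = 0;
        for my $req (@requirements) {
            if (my $by = $trace{$req}) {
                printf "%s verified by %s\n", $req, join(', ', @$by);
            }
            else {
                warn "$req has no test\n";
                $missing++;
            }
        }
        exit($missing ? 1 : 0);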

    Yes, it can be done in a wiki or in Excel. We did it in Excel. It sucked. Really. Starting with the fact that an Excel document can only be edited on one machine at a time. A wiki would have improved that, but you would still have to do all the tracing manually. In that regard, Jira + R4J + T4J is a huge improvement. Three people executing different test plans in parallel is no problem, and checking the traces takes just a few mouse clicks instead of hours of clicking through MS Office documents. And once you reach 100% test execution, a few more mouse clicks export documents listing each test, its execution(s), and the links to the requirements. That can be added to the project documentation, making the client's auditors happy. And ours too, because we could trace any single test result back to the requirements directly in the web browser. It really impresses auditors.

    (And no, we don't do that just to make auditors happy. It's a nice side effect.)

    Alexander

    --
    Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)