If I understood the logic of your sodium system example correctly, the point was that it would have been inappropriate to fully automate the control process and eliminate human oversight: a person would always need to oversee the process and respond to the information supplied by the control system. I completely agree with that. In spite of claims some have made about the state of artificial intelligence, software development has not reached the point where programs can completely replace human oversight in systems where failure carries the serious consequences you describe.
I think he did mention something about having a human operator "in the loop" in this case (though it was a long time ago).
Now, had it been a chemical engineering class, there might have been some discussion of how such a potentially catastrophic process came about in the first place, and of how to design the process itself to be safer. But it wasn't that kind of class.