No such thing as a small change
You obviously know more about all things Unicode than I do--I've had barely any reason to use it--so I'll ask you:
Is there no possibility that, when encoding a string to one of the many Unicode encodings for output to an external system, there might legitimately be null bytes embedded within the result?
If there isn't, then detecting and warning about embedded nulls would only require a single pass over every scalar passed to a system API, looking for nulls.
If there is--and I feel sure that some of the MS wide-character encodings contain characters where one half of the 16-bit value is a null byte, but I don't have proof yet--then it would require two passes in order to guard against false positives causing spurious warnings/dies.
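For what it's worth, here is a minimal sketch of the case I have in mind, using Perl's core Encode module and an arbitrary ASCII-range string of my own choosing. Encoding to UTF-16LE (the encoding behind the Windows wide-character APIs) produces a byte string in which every other byte is a NUL, so a naive byte-level scan would flag perfectly legitimate output:

    use strict;
    use warnings;
    use Encode qw(encode);

    my $str   = 'ABC';
    my $bytes = encode('UTF-16LE', $str);   # wide-character form, as used by Windows APIs

    # Raw bytes are 41 00 42 00 43 00 -- a NUL after every ASCII character
    print unpack('H*', $bytes), "\n";       # prints "410042004300"

    # A single-pass NUL scan over the encoded octets flags this legitimate data
    print "embedded NUL found\n" if index($bytes, "\0") >= 0;

If that reasoning holds, any such check would presumably have to examine the character string before encoding rather than the encoded octets afterwards, or it would generate exactly the spurious warnings I'm worried about.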
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.