I'd say binary considerations are incompatible with risk calculations because no person is perfect, no procedure is perfectly followed, nothing is completely bug-free, and so on. Some small part of a calculation might appear binary, but other terms usually dominate.
Risk calculations are far broader than infosec and don't deserve the dismissiveness you seem to be casting towards them. Risk calculations are the core of business. Almost every decision a business makes is a risk calculation; every action has an opportunity cost if it isn't intrinsically risky, and actions with certainty are very rare.
(For the avoidance of doubt, I believe that use of memory-unsafe languages should be avoided if reasonably possible, but there are still plenty of reasonable reasons to use C, C++ etc. instead.)
I'm not trying to dismiss risk calculations. I appreciate them and the challenges involved, having worked on tools supporting risk calculations in the corporate space.
I feel this thread is getting out of hand. I initially replied to explain why the kind of thinking that leaves you unsatisfied with mere reductions of a concern, as opposed to eliminations, is common among programmers: it's a sound heuristic. Reducing is good, but eliminating is better.