There are many people arguing over the specific semantics of the author's arguments, but I believe the core problem is that C and C++ both dramatically overuse “undefined” versus “unspecified”.
The difference is huge. Signed integer overflow is (per spec) undefined behaviour, so an obvious bounds check is UB and so can be removed. If it were merely unspecified, the compiler would be required to be at least self-consistent: it couldn’t do 2’s complement in one place, but then treat arithmetic as not being 2’s complement elsewhere (in the overflow checks). E.g. if the compiler emits code where MAX_INT+1 is MIN_INT, then the compiler can’t also pretend that that doesn’t happen.
Undefined should be reserved solely for things that cannot have a specified behavior (UaF, OoB memory, IO weirdness, etc).
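To make the removed-check point concrete, here is a minimal sketch. The function names are mine, but the pattern is the classic one: because signed overflow is UB, gcc and clang are entitled to assume `a + 1 > a` for any `int a` and fold the first function to `return 0;`, silently deleting the programmer's overflow check.

```c
#include <limits.h>

/* Looks sound on 2's-complement hardware, but since signed overflow is
 * undefined behaviour the compiler may assume a + 1 never wraps and
 * optimise this whole function to `return 0;`. */
int will_increment_overflow(int a) {
    return a + 1 < a;
}

/* The well-defined way to write the same check: compare against the
 * limit *before* doing the arithmetic, so no overflow ever occurs. */
int will_increment_overflow_safe(int a) {
    return a == INT_MAX;
}
```

The safe version gives the right answer under any conforming compiler; the first one only works if the compiler happens not to exploit the UB.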
While that looks stupid in isolation (and I agree it's annoyingly hard to check for overflow, although gcc and clang have special builtins nowadays to do it), it turns out there are important reasons for that optimisation.
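For reference, the builtins mentioned above look like this; the wrapper name `checked_add` is mine, but `__builtin_add_overflow` is the real gcc/clang intrinsic. It performs the check without ever executing UB and reports overflow via the return value.

```c
#include <limits.h>
#include <stdbool.h>

/* checked_add: thin wrapper over the gcc/clang builtin. Returns true if
 * a + b overflowed an int; on success, *out holds the sum. */
bool checked_add(int a, int b, int *out) {
    return __builtin_add_overflow(a, b, out);
}
```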
In general, knowing that 'a+1' is '1 larger than a' allows for lots of optimisations: when writing to an array in order we can vectorise, do things in bigger chunks, all sorts of useful and important optimisations. If every one of those uses required the compiler to check for overflow, it would seriously affect performance.
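A simple illustration of the kind of loop this helps (the function is a made-up example, not from the thread): because the signed index `i` cannot legally wrap, the compiler knows the loop runs exactly `n` iterations with strictly increasing, non-aliasing addresses, which is exactly what it needs to vectorise the stores or widen them into bigger chunks.

```c
/* With a signed index, the compiler may assume i never wraps around,
 * so dst[0..n-1] is a plain ascending sweep it can vectorise. If the
 * arithmetic had mandatory wrap semantics, it would also have to model
 * the wraparound case before transforming the loop. */
void fill_doubled(float *dst, int n) {
    for (int i = 0; i < n; i++)
        dst[i] = (float)i * 2.0f;
}
```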
Written under the premise of "performance trumps all".
This is why getting rid of the underlying software written in C should be a concern, or at the very least, we should adopt hardware and development practices that tame C. After all, UNIX/POSIX clones won't get replaced overnight.
Butchers that care for their hands also make use of protective gloves when dealing with sharp knives.
In practice, a lot of software is written in C because it depends on interfaces that are defined in terms of their C APIs, without caring all that much about performance.
No C library is changing its ABI every couple of seconds, and many of those tools understand C header files, so it's quite feasible to fix broken bindings every now and then.
The problem isn't changes, it's accommodating multiple versions. Even figuring out where to find headers is not necessarily easy if you're not the local C compiler, for whom the tooling must only begrudgingly exist.