In article <530EA5CD.2070508_at_protected-networks.net>, imb_at_protected-networks.net writes:

><sigh> .. way back in the late 70's or maybe early 80's when I was
>actually doing some work on compilers, we had a saying: "produce correct
>code even if it's not optimal or exit and tell the user why".
>
>Producing non-working code for no apparent reason and without warning is
>counter-productive. It wastes everyone's time :-(

When the specification of "correct" says that anything can happen,
including but not limited to the program crashing at runtime, there's no
limit on what the compiler may emit. (On the other hand, I agree that if
the compiler emits a "crashme" instruction, it really ought to generate
a diagnostic as well, even if it can't explain why.)

Originally this "escape hatch" was intended to apply to conditions that
the compiler could not detect (or at least, that the historical PCC
could not detect), but nowadays the compiler writers take it upon
themselves to deliberately break programs that involve undefined
behavior, even when there is an entirely sensible way to define the
behavior which is consonant with the way the machine architecture works
and how historical compilers have implemented the same construct.

For example, the following program:

    #include <limits.h>
    extern long l;
    int main(void) { l = LONG_MAX; l++; return l > 0; }

..is permitted to crash, but it's also permitted to do nothing, and it's
permitted to set l to LONG_MIN (following the normal two's-complement
arithmetic on signed values). The compilers I checked actually did the
obvious thing, but they are older versions.

-GAWollman

Received on Thu Feb 27 2014 - 16:04:33 UTC
This archive was generated by hypermail 2.4.0 : Wed May 19 2021 - 11:40:47 UTC