I was trying to avoid some work and stumbled across this post (republished here with permission) in one of the LinkedIn discussions. If you are at all interested in software reliability, Les Hatton is someone to respect.

I was lost on LinkedIn (not for the first time) and spotted this discussion.

C v C++. Ah yes, we’ve been debating this for 25 years. C++ is absolutely NOT always better. It is just different.

First of all, the programming language appears to be irrelevant in most empirical studies of injected defects, implemented size and similar behaviour – the most significant factor by a long way remains the quality of the engineers producing the system. However, this disguises an unpleasant truth about OO in general and C++ in particular.

I first studied and published evidence on this in 1997 in IEEE journals. The result of the original studies was a systematic bias in C++ towards significantly LONGER defect correction times. In other words, when you make a mistake in C++, you really pay for it. If the use of C++ led to fewer defects per unit of implemented functionality, we might be able to live with this, but there is no evidence that it does.

Indeed, one of the unpleasant side-effects of the OO paradigm is that it appears to delay the detection of certain classes of defect to much later in the life-cycle, where they become really expensive to find and fix (particularly in embedded systems). For example, in my original studies, the use of C++ increased the cost of finding and fixing defects during system testing by a factor of 4 on comparable systems. Inheritance (single or multiple) in particular appears to be a defect attractor.
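To illustrate the kind of defect being described, here is a contrived C++ sketch of my own (not an example taken from the original studies). The derived class is meant to override a virtual function but accidentally changes a parameter type, so it hides the base version instead. The code compiles cleanly, and the wrong behaviour only surfaces when the object is used polymorphically, typically well into system testing.

#include <iostream>

// Base class, written in one part of the system.
class Sensor {
public:
    virtual ~Sensor() {}
    // Intended extension point: derived classes customise calibration.
    virtual double calibrate(double raw) { return raw; }
};

// Derived class, written later and elsewhere.
class PressureSensor : public Sensor {
public:
    // Bug: the parameter is float, not double, so this HIDES the base
    // function rather than overriding it. The compiler accepts it
    // without complaint unless extra warnings are switched on.
    double calibrate(float raw) { return raw * 1.02; }
};

int main() {
    PressureSensor p;
    Sensor* s = &p;
    // Calls Sensor::calibrate, not PressureSensor::calibrate.
    std::cout << s->calibrate(100.0) << "\n";   // prints 100, not 102
    return 0;
}

C++11's override keyword catches exactly this mistake at compile time, but only when the programmer remembers to write it.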

Other promises of OO have failed to materialise. We saw it as producing a universal toolset which would allow us to literally bolt together new reliable systems from tried and trusty components, free of unpleasant side-effects. This has not proved the case. Indeed, the component size distribution of C++ systems has exactly the same form (a power law) as C (and Ada, Fortran, Tcl/Tk, Matlab, Java and everything else I have looked at – see my website for a study of around 60 million lines of code). This turns out to be inevitable from information theory via the clockwork theorem.
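To make "a power law" concrete (this is my gloss, not Hatton's own wording): if p(n) is the proportion of components of n lines in a system, a power-law size distribution means roughly

    p(n) ∝ n^(-α)

for some fairly stable exponent α, so any large codebase contains many small components and a thin tail of a few very large ones, and the shape of that curve looks essentially the same whichever language was used to write it.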

So, as I mentioned at the start, it is just a different paradigm – neither better nor worse. The best advice I could give after 25 years of experiments is to let your programmers use whatever language they are fluent in. If they are not fluent in any (and fluency takes several years at least), you are in for a rough ride. There isn’t much else to go on because we are not in general a critical or even a scientific discipline when it comes to accumulating compelling evidence.

The seeming irrelevance of programming language is actually a great relief to me. If we even consider the possibility that one symbolic representation of a piece of functionality can, on average, be superior to a different representation, then we must admit the possibility that a Chinese engineer writing about science in Chinese can be systematically better or worse than a German engineer writing about the same subject in German, simply because of the language they are using. This seems to me unconscionable.

Parenthetically, one last thing you might worry about in the months and years to come is that both C and C++ have undergone a pathologically complicated re-standardisation in the last few years, producing standards which no one person can understand (well, I’m damned if I can anyway). Neither has served the embedded-system user well, since the abstract model of computation in both languages leaves both time and space effectively undefined in the rush to add exotic features of unknown reliability. The last draft of the C11 standard before you have to pay for it is 678 pages, nearly 4 times bigger than the C90 standard, which we generally understood. This is still dwarfed by the leviathan C++11 standard, the last version of which I saw was just under 1400 pages.

If you add to this that most CS students are now taught only Java in any depth, there is a crying need for more emphasis on education as opposed to technology.

leshatton.org