Update 5/14/2015: Here is a short video that was made for the Fellow Award Ceremony.
Update 4/9/2015: A video of the interview is now available. It’s searchable via the synchronized transcript.
In February I had the honor of conducting an oral history of Bjarne Stroustrup for the Computer History Museum, on the occasion of his being one of three 2015 Fellow Awards Honorees (the other two being Evelyn Berezin and Charles Bachman).
Programming languages have emerged over the last fifty or so years as one of the most important tools allowing humans to convert computers from theoretical curiosities to the foundation of a new era. By inventing C++ and working tirelessly to evolve, implement, standardize, and propagate it, Bjarne has provided a foundation for software development across the gamut from systems to applications, embedded to desktop to server, commercial to scientific, across CPUs and operating systems.
C was the first systems programming language to successfully abstract the machine model of modern computers. It contained a simple but expressive set of built-in types and structuring methods (structs, arrays, and pointers) allowing efficient access to both the processor and the memory. C++ preserves C’s core model, but adds abstraction mechanisms allowing programmers to maintain efficiency and control while extending the set of types and structures.
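To make the contrast concrete, here is a minimal sketch (my own illustrative code, not drawn from the interview): the C subset gives direct, efficient access to memory through structs, arrays, and pointer arithmetic, while a C++ class defines a new user-level type with the same layout and runtime cost.

```cpp
#include <cstddef>

// The C core model: a struct, an array, and pointer arithmetic give
// direct, efficient access to memory.
struct Point { double x, y; };

double sum_x(const Point* pts, std::size_t n) {
    double total = 0.0;
    for (const Point* p = pts; p != pts + n; ++p)  // pointer arithmetic
        total += p->x;
    return total;
}

// C++ preserves that model but adds abstraction: a user-defined type
// with the same layout and runtime cost as the plain struct above.
class Vec2 {
public:
    Vec2(double x, double y) : x_(x), y_(y) {}
    Vec2 operator+(Vec2 other) const { return {x_ + other.x_, y_ + other.y_}; }
    double x() const { return x_; }
    double y() const { return y_; }
private:
    double x_, y_;  // no hidden overhead relative to struct Point
};
```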
While several of the individual ideas in C++ appeared in earlier research and programming languages, Bjarne has synthesized them into an industrial-strength language, making them available to production programmers in a coherent form. These ideas include type safety, abstract data types, inheritance, parametric polymorphism, exception handling, and more.
In addition to its synthesis of earlier paradigms, C++ pioneered a thorough and robust implementation of templates and overloading. This has enabled the field of generic programming to advance from early research to sufficient maturity for inclusion in the C++ standard library in the form of STL. Additional generic libraries for graphs, matrix algebra, image processing, and other areas are available from other sources such as Boost.
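For a flavor of that generic style, here is a small illustrative example (my code, using only the standard library): one algorithm template is instantiated for unrelated element types, with overload resolution supplying the type-appropriate operations.

```cpp
#include <algorithm>
#include <numeric>
#include <string>
#include <vector>

int main() {
    std::vector<int> nums{3, 1, 4, 1, 5};
    std::vector<std::string> words{"beta", "alpha", "gamma"};

    // One algorithm template works for any element type with operator<;
    // the compiler instantiates it separately for int and std::string.
    std::sort(nums.begin(), nums.end());
    std::sort(words.begin(), words.end());

    // Overload resolution picks integer addition for ints and
    // concatenation for strings from the same generic accumulate.
    int total = std::accumulate(nums.begin(), nums.end(), 0);
    std::string joined =
        std::accumulate(words.begin(), words.end(), std::string{});

    (void)total;
    (void)joined;
}
```

Because each instantiation is compiled to type-specific code, this generality carries no runtime penalty over hand-written loops.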
By combining efficiency and powerful abstraction mechanisms in a single language, and by achieving ubiquity across machines and operating systems, C++ has replaced a variety of more specialized languages such as C, Ada, Smalltalk, and even Fortran. Programmers need to learn fewer languages; platform providers need to invest in fewer compilers. Even newer languages such as Java and C# are clearly influenced by the design of C++.
C++ has been the major focus of Bjarne’s career, and he has had unprecedented success at incremental development of C++, keeping it stable enough for use by real programmers while gradually evolving the language to serve the broadest possible audience. As he evolved the language, Bjarne took on many additional roles in order to make C++ successful: author (books, papers, articles, lectures, and more), teacher, standardization leader, and marketing person.
Bjarne started working on C with Classes, a C++ precursor, in 1979; the first commercial release of C++ was in 1985; the first standard was in 1998, and a major revision was completed in 2011. C++ was mature enough by 1993 to be included in the second History of Programming Languages conference, and a follow-on paper focusing on standardization appeared in the third HOPL conference in 2007.
I hope you enjoy reading the oral history as much as I did interviewing Bjarne.
See also: C++ Historical Sources Archive, Dusty Decks, 11 June 2007.
Update 1/30/2016: Updated URL for C from http://cm.bell-labs.com/cm/cs/cbook/index.html to https://9p.io/cm/cs/cbook/index.html.
Update 9/10/2024: Updated several more URLs.
“C was the first systems programming language to successfully abstract the machine model of modern computers.”
Hardly. Of course, this depends on your exact definition of ‘abstract’. Abstraction actually means getting away from details of the machine. C does not do a good job of that at all, exposing details in all sorts of messy ways. In fact, I find the phrase “abstract the machine model of modern computers” contradictory.
“Successfully” – successful by which measure?
ALGOL certainly beat C by more than 10 years and it really did abstract away from the machine.
Of course, maybe C was successful in terms of becoming widespread, whereas the vastly superior ALGOL was basically only used (and still is) for Burroughs system software, in a far more elegant, secure, and correct implementation.
Modern? That is also somewhat undefined. The machine model used by C is very much based on the PDP-8 machine model. It is debatable whether that is modern. After all, most current (rather than modern) machines are very flawed when it comes to security.
C and C++ propagate those security problems, along with the closely related problems of software correctness.
Ian,
> > “C was the first systems programming language to successfully abstract the machine model of modern computers.”
> Hardly. Of course, this depends on your exact definition of ‘abstract’. Abstraction actually means getting away from details of the machine. C does not do a good job of that at all, exposing details in all sorts of messy ways. In fact, I find the phrase “abstract the machine model of modern computers” contradictory.
A useful abstraction exposes some useful interface while ignoring details irrelevant to that interface. C exposes byte addressing, multiple data types of various sizes and alignments (e.g., char, int, long, float, double), and the corresponding address arithmetic, supporting the hardware architectures that began with the IBM System/360 and became pervasive, while hiding the assembly-language details (e.g., how many index registers are there? what are the exact addressing modes?).
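As a rough illustration of that interface (a sketch of my own; the printed sizes vary by platform):

```cpp
#include <cstdio>

int main() {
    // The language exposes the sizes (and alignments) of its basic
    // types; the exact values depend on the platform.
    std::printf("char=%zu int=%zu long=%zu double=%zu\n",
                sizeof(char), sizeof(int), sizeof(long), sizeof(double));

    // Address arithmetic is defined in terms of element sizes: p + 2
    // advances by 2 * sizeof(int) bytes, with no mention of registers
    // or addressing modes.
    int a[4] = {10, 20, 30, 40};
    int* p = a;
    std::printf("*(p + 2) = %d\n", *(p + 2));
}
```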
> “Successfully” – successful by which measure?
By the measure of adoption: widespread use for operating systems, database management systems, compilers, and other system and application software over many decades.
> ALGOL certainly beat C by more than 10 years and it really did abstract away from the machine.
I’m somewhat familiar with ALGOL (see http://www.softwarepreservation.org/projects/ALGOL). It’s elegant, but does not provide the above-mentioned machine-level interface.
> Of course, maybe C was successful in terms of becoming widespread, whereas the vastly superior ALGOL was basically only used (and still is) for Burroughs system software, in a far more elegant, secure, and correct implementation.
Successful means “succeeded in the marketplace of ideas and commerce”; “was widely used”.
> Modern? That is also somewhat undefined. The machine model used by C is very much based in the PDP-8 machine model. Debatable whether that is modern. After all most current (rather than modern) machines are very flawed when it comes to security.
I don’t believe Ritchie and Thompson were influenced by the PDP-8. They were familiar with BCPL, which came from word-oriented computers like the IBM 7090 family, but they were porting Unix to new byte-oriented computers. Again, modern means what is in use now: Intel, ARM, PowerPC, MIPS, SPARC, etc.
> C and C++ propagate those security problems and the closely related topic of software correctness.
Yes, I don’t believe I said C and C++ solve all problems. Of recent developments, I think Rust shows promise.
Paul
Hi Paul,
Thanks for your reply. I noticed last week that you were involved in ALGOL preservation and that both Nigel Williams and Paul Kimpel were mentioned. I have had dinner with Paul; unfortunately, Nigel was not available on a recent visit down to Tasmania.
I thought your measure of success was probably adoption. As you know, ALGOL was successfully used as a systems language on the B5000 well before C. These machines were quite widespread in the 1960s, but Unix became popular, I think, more due to popular culture than to technical reasons (although Burroughs were hopeless marketers).
As for the IBM 360 and 7090, I think these are horrible architectures. The best thing that came out of the 360 was Fred Brooks’s The Mythical Man-Month. Fundamentally, I think the most widespread architectures, and languages like C that go with them, are broken.
Yes, I agree Thompson and Ritchie would have been influenced by BCPL, but they were constrained by the PDP-8 on which C and Unix were first constructed. One of my mentors built a BCPL system (in microcode) for the B1700. He said that after this he was convinced that languages such as BCPL, with byte, address, and other machine-oriented abstractions, were the wrong way to go. He later helped me with a microcoded C system, also on the B1700. I’m likewise convinced that machine-level abstractions (such as bits, bytes, and addresses – and to most people I have to point out that yes, even these are abstractions) are the wrong level of abstraction, even for systems programming.
But then I have seen that pure higher-level abstractions can work, and work a lot better for producing correct and – even more importantly in the 2010s – secure software. So I don’t like current architectures and languages – hardly a popular position to take.
Thanks for your comments. I’m glad we could connect. It sounds like you do interesting stuff.
Ian
> Yes, I agree Thompson and Ritchie would have been influenced by BCPL, but constrained by the PDP-8 on which C and Unix were first constructed.
You need to do your homework; I suggest you read Ritchie’s “The Development of the C Programming Language” from the second ACM SIGPLAN History of Programming Languages Conference (1993), available here: https://www.bell-labs.com/usr/dmr/www/chist.html
The first version of Unix was on an 18-bit word-addressed PDP-7, not the 12-bit PDP-8. Then Unix was ported to the 16-bit byte-addressed PDP-11. Ritchie describes how C evolved from B, which evolved from BCPL.
OK, thanks for the clarification that it was the PDP-7, not the 8. I’ll take the paper you cite as definitive, but others do say the PDP-8:
http://oceanpark.com/papers/unix.html
So it is not exactly that I need to do homework, and that small point does not disprove the point I was making: that C was constrained by the PDP-7. What I mean is that it was a very small machine, with not much room to develop a sophisticated compiler. Thus C was constrained.
As you would know, many would not consider ALGOL, viewing it as too hard to implement (apart from people like Donald Knuth on a small Bx00, Tony Hoare at Elliott, and then Bob Barton and his team). Thus many things the ALGOL definition demanded the compiler do were pushed back onto the programmer. While I think programmers should understand that level, actually working at it in practice makes C a very tedious and brittle language (you can’t make one small change without going through and changing a whole lot, whereas in a true HLL you’d just make one change and recompile).
C also seems to have got #defines from Burroughs ALGOL, which had a better define … # construct. I’m not sure where Burroughs got it from – perhaps from Elliott? Maybe you know some history about that.
You might be interested in my Burroughs page:
http://ianjoyner.name/Burroughs.html
especially Richard Waychoff’s Stories of the B5000 and People Who Were There.
This is much closer to the start of computing and foundations than C.
Ian
> OK, thanks for the clarification that it was the PDP-7, not the 8. I’ll take the paper you cite as definitive, but others do say the PDP-8:
http://oceanpark.com/papers/unix.html
Did you actually read the article you mention? It starts out talking about the PDP-8, but then jumps to this not-quite-correct statement (that omits the PDP-7):
“And it just so happened that at AT&T Bell Labs, a young hacker named Ken Thompson began to create a new timesharing operating system on an idle PDP-11.”
In general, it is best to start with the papers written by the principals (Thompson and Ritchie in this case) rather than by someone who hides behind a pseudonym and does not include any references.
Hi Paul,
I was just using that web page as an example that the PDP-8 is cited rather than the PDP-7. If the PDP-7 is correct, thank you for the correction. I’m quite happy to be corrected on that.
However, that detail does not rebut my point about abstraction. There are two schools of thought: that languages should support machine-oriented abstractions (bits, bytes, etc., from BCPL) or problem-oriented abstractions, where machine details are masked by the language definition and compilers. Dijkstra noted that this is a difference between European and US computing, where US computing was much more about electronics.
While the machine-oriented approach might still be more popular, I think problem-oriented abstractions are much more powerful, have longevity, and are the right way to go.
Ian
OK, I just managed to find my paper copy of Ritchie’s “The Evolution of the Unix Time-sharing System” (this copy printed 12/7/98). Indeed it says PDP-7, so it must have been my faulty memory.
Ah, yes, file systems. I think hierarchical directories also came from the B5000 MCP. When first implemented, they had separate directories, and thus arbitrary depth. However, in a commercial environment, traversing the directories to open a file proved slow, so they came up with a flat directory and limited the depth to 14 levels.
Two-letter operator command names were also an MCP feature.
Tracing where these ideas came from and their interrelationships is indeed interesting, and I’m glad you are doing such important work.
Ian
@IanJoyner > There are two schools of thought – that languages should support machine-oriented abstractions (bits, bytes, etc, from BCPL) or problem-oriented abstractions, where machine details are masked by language definition and compilers.
My point in the original post is that C++ combines support of machine-oriented abstractions (which are necessary for many kinds of software) with support for problem-oriented abstractions. See for example _Elements of Programming_ (http://elementsofprogramming.com).
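As a small illustrative sketch of that combination (my code, assuming nothing beyond the standard library): fixed-width integers and bit operations serve the machine-oriented side, while a template over iterators serves the problem-oriented side.

```cpp
#include <cstdint>
#include <numeric>
#include <vector>

// Machine-oriented abstraction: a fixed-width unsigned type and bit
// operations, as one might use for a device register or protocol field.
std::uint32_t set_flag(std::uint32_t reg, unsigned bit) {
    return reg | (std::uint32_t{1} << bit);
}

// Problem-oriented abstraction: a generic algorithm over any sequence,
// independent of how its elements are represented in memory.
template <typename Iterator, typename T>
T sum(Iterator first, Iterator last, T init) {
    return std::accumulate(first, last, init);
}

int main() {
    std::uint32_t status = set_flag(0u, 3);   // low-level view
    std::vector<int> v{1, 2, 3};
    int total = sum(v.begin(), v.end(), 0);   // high-level view
    (void)status;
    (void)total;
}
```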
@IanJoyner > I think the hierarchical directories also came from the B5000 MCP.
Can you provide a reference? In their 1974 CACM paper on Unix, Ritchie and Thompson cite Multics and TENEX in the Influences section. By 1970 or so, the idea of hierarchical directories was “in the air” (one system I was personally involved with, the CAL Timesharing System at the University of California at Berkeley, provided them). Burroughs could have been a pioneer, in which case there must be documentation somewhere.
“Can you provide a reference?”
Not offhand. I was hoping the Computer History Museum might have some material. Someone like Paul Kimpel would have earlier material than I have.
I might have a quite old email in which one of the old-timers replied to my question about why MCP directories were one big flat directory instead of real hierarchical directories as in Unix. That was the reply I got: that they started as real hierarchical directories, but for speed they made them flat.
Most likely that is because the MCP was meant as a server machine in a well-controlled environment, rather than a user-centric machine where people controlled their own personal work as in Unix.
“My point in the original post is that C++ combines support of machine-oriented abstractions (which are necessary for many kinds of software)”.
I’d counter that point by saying absolutely no machine-oriented abstractions are needed for any kinds of software. That is the basis of computation and Turing machines.
Where you might want machine-oriented abstractions is to control the physical world of electronics. But that, as we now see, is a dangerous level. PLC developers in particular have recognised the need to prevent hackers from installing software that can directly affect the physical world. We need to address the issue in basic CPU architecture as well, which would preclude many of the low-level machine-oriented abstractions of C, because they are insecure.
Computation is not dependent on electronics and is independent of electronics-oriented abstractions. Programmers should not have to consider machine-oriented abstractions – it is the job of compilers to map problem-oriented abstractions as efficiently as possible onto machine-oriented ones. Sometimes programmers might be aware of these issues, but that is not ideal. Compiler writers of course must be aware, but language designers should provide only problem-oriented abstractions – including for system software. It is in that respect that C++ is a failure – despite its popularity, which is more cultural than technical.
Ian
@IanJoyner > “I’d counter that point by saying absolutely no machine-oriented abstractions are needed for any kinds of software.” You’ve expressed your view. I am closing this thread.
@IanJoyner > “I was hoping computing history museum might have some material.”
This is pretty far off the original subject of this post. The oldest mention of hierarchical directories I am aware of is this 1965 Multics paper:
R. C. Daley and P. G. Neumann. 1965. A general-purpose file system for secondary storage. In Proceedings of the November 30–December 1, 1965, fall joint computer conference, part I (AFIPS ’65 (Fall, part I)). ACM, New York, NY, USA, 213-229. DOI=http://dx.doi.org/10.1145/1463891.1463915
Tom Van Vleck has posted a copy online here: http://www.multicians.org/fjcc4.html . If you come across documentation of an earlier occurrence of this idea I’d be interested in hearing about it. Until then, this thread is closed.