Preface

Copyright (c) 2000, Morgan Kaufmann Publishers


A course in computer programming provides the typical student's first exposure to the field of computer science. Most of the students in such a course will have had previous exposure to computers, in the form of games and other personal computer applications, but it is not until they write their own programs that they begin to appreciate how these applications work. After gaining a certain level of facility as programmers (presumably with the help of a good course in data structures and algorithms), the natural next step is to wonder how programming languages work. This book provides an explanation.

In the conventional "systems" curriculum, the material beyond data structures (and possibly computer organization) is compartmentalized by subarea, with courses in programming languages, compilers, computer architecture, operating systems, database management systems, and possibly software engineering, graphics, or user interface systems. One problem with this approach is that many of the most interesting things in computer science occur at the boundaries between these subareas. The RISC revolution, for example, has forged an intimate alliance between computer architecture and compiler construction. The advent of micro-kernels has blurred the distinction between the operating system kernel and the language run-time library. Aggressive memory systems for supercomputers are re-defining the relative roles of the operating system, the compiler, and the hardware. And programming language design has always been heavily influenced by implementation issues. Increasingly, both educators and researchers are recognizing the need to focus on these interactions.

Another problem with the compartmentalized curriculum is that it offers more courses than the typical undergraduate can afford to take. A student who wants to gain a solid background in theory, artificial intelligence, numerical methods, or various allied fields cannot afford to take five upper-level courses in systems. Rather than give the student an in-depth look at two or three relatively narrow sub-areas, I believe it makes sense to provide an integrated look at the most fundamental material across subareas.

At its core, Programming Language Pragmatics is a book about how programming languages work. It is in some sense a mixture of traditional texts in programming languages and compilers, with just enough assembly-level architecture to accommodate the student who has not yet had a course in computer organization. It is not a language survey text: rather than enumerate the details of many different languages, it focuses on concepts that underlie all of the languages the student is likely to encounter, illustrating those concepts with examples from various languages. It is also not a compiler construction text: rather than explain how to build a compiler (a task few programmers will ever need to tackle in its entirety, though they may use front-end techniques in other tools), it explains how a compiler works, what it does to a source program, and why. Language design and implementation are thus explored together, with attention to the ways in which they interact. When discussing iteration, we can see how semantic issues (what is the scope of an index variable? what happens if the body of a loop tries to modify the index or loop bounds?) have interacted with pragmatic issues (how many branch instructions must we execute in each iteration of the loop? how do we avoid arithmetic overflow when updating the index?) to shape the evolution of loop constructs. When discussing object-oriented programming, we can see how the tension between semantic elegance and implementation speed has shaped the design of languages such as Smalltalk, Eiffel, C++, and Java.
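As a small, concrete illustration of the first of these semantic questions, consider the following C++ sketch (a minimal example for this preface only, not one drawn from the chapters that follow). It shows one possible set of answers: an index declared in the loop header is local to the loop, and the body is permitted to modify it.

    #include <iostream>

    int main() {
        // In C++, an index declared in the for header is local to the loop;
        // it no longer exists once the loop terminates.
        for (int i = 0; i < 3; ++i) {
            std::cout << i << '\n';
        }
        // std::cout << i;      // error: 'i' is out of scope here

        // The body may also modify the index, something Ada and Fortran,
        // for example, forbid in their counted loops.
        for (int j = 0; j < 10; ++j) {
            if (j == 2) j += 5;  // legal in C++; j jumps from 2 to 7,
                                 // skipping several iterations
            std::cout << j << '\n';
        }
        return 0;
    }

A language that makes the index local and immutable leaves the implementation more freedom in how the loop is compiled; this is precisely the sort of design/implementation interaction the book explores.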

In the typical undergraduate curriculum, this book is intended for the programming languages course. It has a bit less survey-style detail than certain other texts, but it covers the same breadth of languages and concepts, and includes much more information on implementation issues. Students with a strong interest in language design should be encouraged to take additional courses in such areas as formal semantics, type theory, or object-oriented design. Similarly, students with a strong interest in language implementation should take a subsequent course in compiler construction. With this book as background, the compiler course will be able to devote much more time than is usually possible to optimization and code generation, where most of the interesting work these days is taking place.

At the University of Rochester, the material in this book has been used for about ten years to teach a course entitled "Software Systems". The course draws a mixture of mid- to upper-level undergraduates and first-year graduate students. The book should also be of value to professional programmers and other practitioners who simply wish to gain a better understanding of what's going on "under the hood" in their favorite programming language.

By integrating the discussion of syntactic, semantic, and pragmatic (implementation) issues, the book attempts to provide a more complete and balanced treatment of language design than is possible in most texts. The hope is that students will come to understand why language features were designed the way they were, and that as programmers they will be able to choose an appropriate language for a given application, learn new languages easily, and make clear and efficient use of any given language. In most chapters the concluding section returns to the theme of design and implementation, highlighting interactions between the two that appeared in preceding sections. In addition, appendix B contains a summary list of interactions, with references to the sections in which they are discussed. These interactions are grouped into several categories, including language features that most designers now believe were mistakes, at least in part because of implementation difficulties; potentially useful features omitted from some languages because of concern that they might be too difficult or slow to implement; and language features introduced at least in part to facilitate efficient or elegant implementations.

Some chapters (2, 4, 5, 9, and 13) have a heavier emphasis than others on implementation issues. These can be reordered to a certain extent with respect to the more design-oriented chapters, but it is important that chapter 5 or its equivalent be covered before chapters 6, 7, or 8. Many readers will already be familiar with some of the material in chapter 5, most likely from a course on computer organization. In this case the chapter can easily be skipped. Be warned, however, that later chapters assume an understanding of the assembly-level architecture of modern (i.e., RISC) microprocessors. Some readers may also be familiar with some of the material in chapter 2, perhaps from a course on automata theory. Much of this chapter can then be read quickly, pausing perhaps to dwell on such practical issues as recovery from syntax errors.

For self-study, or for a full-year course, I recommend working through the book from start to finish. In the one-semester course at Rochester, we also cover most of the book, but at a somewhat shallower level. The lectures focus on the instructor's choice of material from the following chapters and sections: 1, 2.1-2.2.3, 3, 4.1-4.5, 6-8, 9.1-9.3, and 10-12. Students are asked to read all of this material except for the sections marked with an asterisk in the table of contents. They are also asked to skim chapter 5; most have already taken a course in computer organization.

For a more traditional programming languages course, one would leave out 2.2, 4, 5, and 9, and de-emphasize the implementation-oriented material in the remaining chapters, devoting the extra time to more careful examination of semantic issues and to alternative programming paradigms (e.g., the foundational material in chapter 11). For a school on the quarter system, one appealing option is to offer an introductory one-quarter course and two optional follow-on courses. The introductory quarter might cover 1, 2.1-2.2.3, 3, 6, 7, and 8.1-8.4. A language-oriented follow-on quarter might cover 8.5-8.6, 10-12, and possibly supplementary material on formal semantics, type systems, or other related topics. A compiler-oriented follow-on quarter might cover 2.2.4-2.3, 4, 5 (if necessary), 9, 13, and possibly supplementary material on automatic code generation, aggressive code improvement, programming tools, etc. One possible objection to this organization is that it leaves object orientation and functional and logic programming out of the introductory quarter. An alternative would be to start with a broader and more exclusively design-oriented view, moving 1.4-1.6 and 2.2.1-2.2.3 into the compiler-oriented quarter, de-emphasizing the implementation-oriented material in 6-8, and adding 10.1-10.4, 10.6, and the non-foundational material in 11.

I assume that the typical reader already has significant experience with at least one high-level imperative programming language. Exactly which language it is shouldn't matter. Examples are drawn from a wide variety of languages, but always with enough comments and other discussion that they should be easy for readers not familiar with the language to understand. Algorithms, when needed, are presented in an informal pseudo-code that should be self-explanatory.

Each of the chapters ends with review questions and a set of more challenging exercises. Particularly valuable are those exercises that direct students toward languages or techniques that they are unlikely to have encountered elsewhere, or to encounter elsewhere soon. I recommend programming assignments in C++ or Java; Scheme, ML, or Haskell; and Prolog. An assignment in exception handling is also a good idea; it may be written in Ada, C++, Java, ML, or Modula-3. If concurrency is covered, an assignment should be given in SR, Java, Ada, or Modula-3, depending on local interest. Sources for language implementations are noted in appendix A.

In addition to these smaller projects (or in place of them if desired), instructors may wish to have students work on a language implementation. Since building even the smallest compiler from scratch is a full-semester job, students at Rochester have been given the source for a working compiler and asked to make modifications. For many, this is their first experience reading, understanding, and modifying a large existing program, a valuable exercise in and of itself. The Rochester PL/0 compiler translates a simple language due to Wirth into MIPS I assembly language, widely considered the "friendliest" of the commercial RISC instruction sets. An excellent MIPS interpreter ("SPIM") is available from the Computer Science Department at the University of Wisconsin (http://www.cs.wisc.edu/~larus/spim.html). Source for the compiler itself is available from Rochester (ftp://ftp.cs.rochester.edu/pub/packages/plzero/). It is written in C++, with carefully separated phases and extensive documentation.

Michael L. Scott
Rochester, NY
March 1999

