And note: complex, urgent problems have to be dealt with collectively!
I happen to believe that if we don't significantly increase our collective capabilities for recognizing, understanding, and coping with complex, urgent problems, much of our civilization will be at risk of crashing and burning.
The scale of this challenge is so large that it can only be pursued directly and effectively by following an appropriately scaled improvement strategy.
That's what I'll be describing ― my proposed strategy. Call it Bootstrapped, Facilitated Evolution.
And then I'll ask why the OOP people, who seem to have developed such a superior way to deal with information objects, haven't already solved this. Hmm, perhaps they'll join the pursuit?
On December 9, 1968, Douglas C. Engelbart and a team of software developers gave the first public demonstration of a computer with a windowed interface, videoconferencing, black-on-white text, context-sensitive help, and a mouse. They delivered this demo to 4000 stunned spectators at the Fall Joint Computer Conference in San Francisco. Although the demo system was linked to a remote mainframe computer, it sparked research that led to the development of personal computers, the graphical user interface, and more advanced networks. It launched a disruptive revolution in the way people work, communicate, and produce. If not for Douglas Engelbart, many of the technical innovations we consider vital to the personal computer revolution would not exist.
In my talk, I will compare and contrast two ideas of programming and programming language design: the functional and the object-oriented. I will present and defend the thesis that good object-oriented programming borrows heavily from functional programming, and that the future of object-oriented programming is to study functional programming and language design even more closely.
Matthias Felleisen's career consists of two parts. For the first ten years (1984–1994), he focused on the development of a new form of operational semantics and used it to study design issues in mostly functional programming languages. This form of operational semantics, often dubbed evaluation-context semantics, has become the standard tool for studying the well-definedness of programming languages (i.e., for proving type-soundness theorems). His work on continuation-based control constructs and calculi of control has spawned small areas of investigation in both control constructs and logic.
In 1994, Felleisen and his research group (PLT) began work on a programming environment for novice programmers (DrScheme). They use this software, together with a curriculum developed in parallel, to inject true design principles into the introductory programming curriculum. They also use the software-development project itself to study problems in programming languages, software engineering, and operating systems. Over the past ten years, Felleisen and his collaborators have published numerous papers on object-oriented design patterns, the nature of classes and mixins, the interaction between classes and modules, extensibility in functional and OO programs, and other object-oriented topics.
Felleisen spent most of his career at Rice University, with short sabbaticals at Carnegie Mellon University (Pittsburgh) and the École Normale Supérieure (Paris). He is now a Trustee Professor at Northeastern University, Boston.
We present richer notions of interfaces, which expose not only type information about software modules but also temporal information. For example, the interface of a file server with the two methods open_file and read_file may stipulate that read_file must not be called before open_file has been called. Symmetrically, the interface of a client must then specify the possible sequences of open_file and read_file calls during its execution, so that a compiler can check whether the server and the client fit together. Such behavioral interfaces, which expose temporal information about a module and at the same time impose temporal requirements on the environment of the module, can be specified naturally using an automaton-based language. In other situations, the appropriate notion of compatibility between software modules, as suggested by the first principle of interface design, is richer still and may require, for example, the exposure of assertional, real-time, and resource-use information. This leads, in turn, to push-down, timed, and resource interfaces. For instance, resource interfaces can be used to ensure that no two modules simultaneously access a unique resource.
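The file-server protocol above can be sketched as a small finite automaton. The following Python encoding is only an illustrative sketch: the method names open_file and read_file come from the example, but the class and its API are invented here and are not part of any actual interface-language implementation.

```python
# A behavioral interface as a finite automaton: states plus a partial
# transition function over method names. A call sequence is allowed
# exactly when every call has a transition from the current state.
class BehavioralInterface:
    def __init__(self, start, transitions):
        self.start = start
        self.transitions = transitions  # maps (state, method) -> next state

    def accepts(self, calls):
        """Check whether a sequence of method calls respects the protocol."""
        state = self.start
        for call in calls:
            key = (state, call)
            if key not in self.transitions:
                return False  # call not permitted in this state
            state = self.transitions[key]
        return True

# File-server protocol: read_file must not be called before open_file.
server = BehavioralInterface(
    start="closed",
    transitions={
        ("closed", "open_file"): "open",
        ("open", "read_file"): "open",
    },
)

print(server.accepts(["open_file", "read_file", "read_file"]))  # True
print(server.accepts(["read_file"]))                            # False
```

A client interface would be the dual: an automaton describing the call sequences the client may emit, so that a checker can test whether every client sequence is accepted by the server's automaton.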
We formally capture the requirements on interfaces by axiomatizing interface theories. For example, the axiom of "independent implementability" of interfaces guarantees that if A and B are compatible interfaces, and A′ is a module that conforms to interface A, and B′ is a module that conforms to interface B, then the composition A′||B′ of the two modules conforms to the composite interface A||B. For selected interface formalisms, such as behavioral, push-down, timed, and resource interfaces, we show that they satisfy the axioms of interface theories, and we discuss the following three algorithmic problems:
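The independent-implementability axiom can be illustrated with a deliberately simplified, trace-based reading of interfaces. In this sketch an interface is just a finite set of allowed call sequences, conformance is trace inclusion, and composition is synchronous intersection over a shared alphabet; all three choices are assumptions made for illustration and are not the interface formalisms of the talk.

```python
# Simplified trace-based sketch (assumption: an interface is a finite
# set of allowed call sequences, not an interface automaton).

def conforms(impl, spec):
    """A module conforms to an interface if every trace it can
    produce is allowed by the interface (trace inclusion)."""
    return impl <= spec

def compose(t1, t2):
    """Synchronous composition over a shared alphabet: a trace of the
    composite must be allowed by both sides (set intersection)."""
    return t1 & t2

# Interfaces A, B and conforming implementations A_impl, B_impl
# (names are invented for this sketch).
A = {(), ("open_file",), ("open_file", "read_file")}
B = {(), ("open_file",), ("open_file", "read_file")}
A_impl = {(), ("open_file",)}
B_impl = {(), ("open_file",), ("open_file", "read_file")}

assert conforms(A_impl, A) and conforms(B_impl, B)
# Independent implementability: the composition of the conforming
# modules conforms to the composition of the interfaces.
assert conforms(compose(A_impl, B_impl), compose(A, B))
print("independent implementability holds for this example")
```

Under this trace-inclusion reading the axiom holds by construction; the substance of the talk is showing that it also holds for the much richer behavioral, push-down, timed, and resource formalisms.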
Tom Henzinger is a Professor of Electrical Engineering and Computer Sciences at the University of California, Berkeley. He holds a Dipl.-Ing. degree in Computer Science from Kepler University in Linz, Austria, an M.S. degree in Computer and Information Sciences from the University of Delaware, and a Ph.D. degree in Computer Science from Stanford University (1991). He was an Assistant Professor of Computer Science at Cornell University (1992–95), and a Director of the Max-Planck Institute for Computer Science in Saarbrücken, Germany (1999).
His research focuses on modern systems theory, especially formalisms and tools for the component-based and hierarchical design, implementation, and verification of embedded, real-time, and hybrid systems. His HyTech tool was the first model checker for mixed discrete-continuous systems.