Prospects for Intelligent Simulation

MIT Project on Mathematics and Computation
Artificial Intelligence Laboratory
Massachusetts Institute of Technology

The objective of this research is to demonstrate breakthrough applications that exploit new computer representations and reasoning mechanisms. These mechanisms enable intelligent systems to autonomously design, monitor, and understand complex physical systems, through appropriate mixtures of numerical computing, symbolic computing, and knowledge-based methods. The techniques of intelligent simulation are broadly applicable, because they are based on the intelligent integration of information from mathematics and physics.

Systems incorporating these mechanisms can automatically prepare computational experiments from high-level domain descriptions. They actively monitor computational and physical experiments. They autonomously explore engineering design spaces and distinguish among theoretical explanations of observed phenomena. They automatically propose models of physical systems and match these against measurements and observations.


This work is driven by considering particular problems from science and engineering. Scientific theories and engineering designs are tested by computing their consequences and comparing them with the real world. Intelligent simulation research is directed at automating this process.

Intelligent simulators combine numerical, symbolic, and qualitative methods with computational formulations of mathematical and physical phenomena. They automatically prepare and monitor simulations and automatically interpret results. They carefully choose simulations to survey the range of possible models of unknown systems. By matching the observable details of the experimental situation with these consequences, they identify the model that best accounts for the observations.

Our work has already demonstrated applications to the design of high-performance automatic control systems. Other application areas include dynamically stabilized structures, precision measurement, and intelligent materials.

We plan to create the system-architectural, algorithmic, and technological foundations for exploiting programmable materials. These are materials that incorporate vast numbers of programmable elements that react to each other and to their environment. Such materials can be fabricated economically, provided that the computing elements are amassed in bulk without arranging for precision interconnect and testing. In order to exploit programmable materials we must identify engineering principles for organizing and instructing myriad programmable entities to cooperate to achieve pre-established goals, even though the individual entities are unreliable and interconnected in unknown, irregular, and time-varying ways. We call this effort the study of amorphous computing.


    Imagistic Reasoning

  • Ken Yip's KAM program explores a dynamical system by visually examining the phase-space trajectories that it discovers by its own numerical experiments. In collaboration with the KAM program, a team of ocean engineers determined some of the conditions that lead to wave instabilities in towing tanks, a new result in hydrodynamics. The joint effort accomplished more in seven weeks than had been accomplished in the previous year with conventional tools.

    Automatic Control

  • Feng Zhao's MAPS (Modular Analyzer for Phase Spaces) and PSN (Phase Space Navigator) programs are the core of a system that automatically designs high-performance controllers for nonlinear systems by reasoning about phase-space geometry. This system automatically synthesized a high-performance controller for a magnetic levitation system. The new controller can stabilize maglev vehicles with displacements of up to 20 times those allowed in a previous manually-developed controller for the same system.

  • Liz Bradley's Perfect Moment program exploits chaos and instability to automatically synthesize control systems that achieve very fast target acquisition.

  • Mike Eisenberg's Kineticist's Workbench combines numerical simulation and symbolic methods to characterize chemical reaction mechanisms and suggest ways to simplify complex mechanisms.

    Intelligent Structures

  • Andy Berlin demonstrated that intelligent simulation can help in creating dynamically stabilized structures. Such structures will be sensitive and active, incorporating networks of high-performance controllers. Berlin constructed a prototype column that is actively stabilized by piezo-electric actuators. This column supports 5.6 times the load that a passive column of the same size could support. He also demonstrated a truss bridge that uses actively stabilized members to support greater loads than would be possible without active control.

  • Natalya Cohen extended Berlin's work on active structures. Centralized controllers assume both the existence of a global model for the system to be controlled and the availability of global state information. These assumptions fail for complex structures with many sensing sites and many interacting controlled members. Cohen investigated the application of decentralized control to these systems. She constructed controllers and conducted qualitative analysis, simulations, and physical experiments on a prototype beam. She showed that even a very simple, unsophisticated decentralized controller can increase the load-bearing strength of the beam to levels comparable to those achieved with centralized control.

    Support for High-Speed Scientific Computing

  • Andy Berlin implemented partial evaluation and parallel scheduling techniques that dramatically improve performance for an important class of numerical programs, achieving speedups over conventionally compiled code that range from 7 times faster to 91 times faster.
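
    The flavor of this technique can be sketched in Python (a toy illustration of partial evaluation in general; the actual system specialized Scheme numerical kernels): when part of a program's input is known in advance, loops can be unrolled and the known values folded in, leaving straight-line residual code.

```python
def specialize_poly(coeffs):
    """Partially evaluate Horner's rule for a fixed coefficient list,
    emitting straight-line Python source with the loop unrolled and
    the known coefficients folded in as constants."""
    body = "0.0"
    for c in coeffs:
        body = f"({body}) * x + {c!r}"
    src = f"def poly(x):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)      # compile the specialized residual program
    return namespace["poly"]

# Specialized evaluator for 2x^2 - 3x + 1; all loop overhead is gone.
p = specialize_poly([2.0, -3.0, 1.0])
print(p(2.0))  # -> 3.0
```

    The residual procedure contains no loop and no coefficient lookups; applying the same idea to large numerical kernels is what produces speedups of the magnitude reported above.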

  • Rajeev Surati built upon Berlin's work to create a user-extensible partial evaluator. The viability of this approach was demonstrated by applying it to the Stormer integrator on a 6-body problem, achieving a factor of 610 speedup over conventional compilation.

  • In cooperation with Hewlett-Packard, we constructed an 8-processor prototype of a Supercomputer Toolkit -- a technology for configuring a variety of multiprocessor computing instruments that achieve supercomputer performance.

  • Panayotis Skordos demonstrated an effective approach to simulating subsonic fluid dynamics on a cluster of non-dedicated workstations. Using this approach, he implemented the first plausible simulation of the interaction between subsonic hydrodynamic flow and acoustic waves. Skordos's simulation achieved 80% parallel efficiency (speedup/processors) using 20 HP-Apollo workstations. For this work, Skordos won the 1995 Gordon Bell Prize for best price-performance in supercomputing applications.

    Computations of Scientific Importance

  • Gerry Sussman and Jack Wisdom used the Supercomputer Toolkit and the associated compilation technology to perform a breakthrough computation in solar-system dynamics, integrating the complete Solar System, with General Relativity and the Earth-Moon quadrupole correction, for 100 million years. The longest previous such integration was for 3 million years. This was a computation of major scientific significance, because it apparently confirmed that the motion of the Solar System is chaotic. A report on this analysis was published in Science (July 1992), which devoted an editorial to the significance of this achievement.

    Computational Formulations of Scientific Knowledge

  • Gerry Sussman, Jack Wisdom, Hardy Mayer, Chris Hanson, and Hal Abelson have developed the Scheme numerical and symbolic library to the point where it can support a novel approach to variational mechanics. Expressing the methods of variational mechanics in a computer language forces them to be unambiguous and computationally effective. This approach and software are now being used in a graduate subject in physics at MIT, and we are preparing a textbook based on this material.
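
    The notational issue at stake can be seen in the familiar Lagrange equations; in conventional notation (this display is our gloss, not a quotation from the course material):

```latex
\frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot q}\right)
  - \frac{\partial L}{\partial q} = 0
```

    The traditional notation leaves implicit what $\partial L/\partial \dot q$ is a function of and along which path the time derivative is taken; expressing the same law as a program forces each of these choices to be stated explicitly, which is the sense in which the computational formulation is unambiguous.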

  • Does knowledge of language consist of symbolic rules? How are the signals of our perception related to the symbols in our minds? How do children learn and use their linguistic knowledge? In an attempt to elucidate these questions, Kenneth Yip and Gerald Jay Sussman presented a computational model of the acquisition and use of phonological knowledge. Their model encapsulates phonological information as boolean constraint relations operating on the classical linguistic representations of speech sounds in terms of distinctive features. The performance model is described as a hardware mechanism that incrementally enforces the constraints. Phonological behavior arises from the action of this mechanism. Constraints are induced from a corpus of common English nouns and verbs. The induction algorithm compiles the corpus into increasingly sophisticated constraints. The algorithm is incremental, greedy, and fast. It yields one-shot learning from a few examples. Their model has been implemented as a computer program. The program exhibits phonological behavior similar to that of young children. As a bonus, the constraints that are acquired can be interpreted as classical linguistic rules.

    Mathematics of Dynamical Systems

  • Thanos Siapas discovered a new invariant that quantifies the dependence of global geometry of dynamical systems on parameter variations. This is one of the first attempts to characterize fine changes in the geometry of phase-space structures, as opposed to coarse changes in their topology. The invariant determines a power law governing the change in basin geometry as parameters vary.

  • Thanos Siapas also discovered a deep relationship between phase transitions in physical systems and the behavior of parallel methods in numerical optimization.

    When hunting for a minimum of a function, a search algorithm makes a series of local changes, each of which is intended to get closer to the minimum. Of course, the search can get stuck in a bad local minimum. Various strategies, such as simulated annealing, have been employed to prevent this behavior by allowing some updates that actually move away from the minimum. This contrarian tendency is usually characterized by a "temperature" parameter.

    When changes are made in parallel there may be conflicts, so that even if each change is intended to go downhill there may be an increase in the function to be minimized. Thus there is a conflict between the downhill trend and the increases caused by parallelism. Siapas discovered that this conflict behaves in a way analogous to a phase transition in a physical system: as the amount of parallelism increases the performance of the optimization algorithm improves until a critical amount of parallelism is reached, and with more parallelism the performance decreases rapidly to random search. However, Siapas showed that the variation of the performance with the parallelism parameter is quite unlike the variation of the performance with the temperature parameter.

    Siapas reported on numerical experiments establishing this novel phenomenon in both NK landscapes and in the traveling-salesman problem. He also provided a theoretical framework based on finite-size scaling and on mean-field approximations to shed light on his discovery.
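
    The serial strategy described above can be sketched in a few lines of Python (a toy illustration; the landscape, cooling schedule, and parameters are invented for this sketch, and no parallelism is involved):

```python
import math
import random

def anneal(f, x0, step=0.5, t0=1.0, cooling=0.999, iters=5000, seed=0):
    """Minimize f by serial local search.  The temperature t controls
    how often an uphill (contrarian) move is accepted, which lets the
    search climb out of bad local minima."""
    rng = random.Random(seed)
    x, t, best = x0, t0, x0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        delta = f(cand) - f(x)
        # Downhill moves are always accepted; uphill moves are
        # accepted with probability exp(-delta / t).
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if f(x) < f(best):
                best = x
        t *= cooling          # gradually suppress contrarian moves
    return best

# Toy double-well landscape: a shallow minimum near x = 1 and a
# deeper one near x = -1; the search starts in the shallow well.
f = lambda x: (x * x - 1) ** 2 + 0.3 * x
x = anneal(f, x0=1.0)
```

    Raising t0 makes the contrarian, uphill moves more frequent early in the search; Siapas's point is that the parallelism parameter, although it also injects uphill moves through conflicting updates, does not behave like this temperature knob.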

  • Elmer Hung discovered two properties of dynamical systems that are helpful in performing parameter estimation in unstable dynamical systems. First, although most observations in a time series provide little information about the parameters, there are portions of the trajectory that are extremely sensitive to parameter variations. Second, although the phenomenon of "shadowing" causes trouble in parameter estimation, there is an asymmetry of shadowing behavior that provides a handle on the parameter values. Using these properties, Hung devised an algorithm that achieves accuracies several orders of magnitude better than the Kalman filter and has good convergence properties for large data sets.
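
    The first of these observations is easy to see in a toy chaotic system (the logistic map; this example is our illustration, not Hung's actual experimental setup):

```python
def logistic_traj(r, x0=0.1, n=60):
    """Iterate the logistic map x -> r*x*(1-x) and return the trajectory."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Compare trajectories at two parameter values a distance eps apart.
eps = 1e-9
a = logistic_traj(3.9)
b = logistic_traj(3.9 + eps)

# Per-step sensitivity of the observed state to the parameter change.
sens = [abs(x - y) / eps for x, y in zip(a, b)]
print(min(s for s in sens if s > 0), max(sens))
```

    Early observations move hardly at all when the parameter is perturbed, while later portions of the trajectory are more sensitive by many orders of magnitude; an estimator that concentrates on the sensitive portions can extract far more parameter information per observation.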

    Computing Languages and Systems

  • Bill Rozas introduced the idea of translucent procedures. These are like ordinary procedures in all ways, except that their opacity is violated in a controlled way to allow structure matching against a representation of the procedure text and environment. Rozas implemented a system based on translucent procedures and used it to demonstrate a novel equation-solving system, a compiler for threaded interpreters, and a method for the automatic construction of scientific subroutines to compute special functions.

  • Jonathan Rees demonstrated how the call-by-value lambda-calculus, with a few simple extensions, can provide the complete basis for an operating-system security kernel. Each agent (user or subsystem) has a separate evaluation environment that holds objects representing privileges granted to that agent. Because environments ultimately determine availability of object references, protection and sharing can be controlled largely by the way in which environments are constructed. To demonstrate the feasibility of this approach, Rees used Scheme 48 to build a programming environment for mobile robots and a secure multi-user environment that runs on workstations. Olin Shivers has exploited these same ideas to develop a Scheme-based server for the World-Wide Web.
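
    The environment-based protection model can be suggested in Python, with closures standing in for Scheme environments (all names here are our invention, for illustration only):

```python
def make_file_store():
    """A private store reachable only through the two closures returned."""
    files = {}
    def read(name):
        return files[name]
    def write(name, data):
        files[name] = data
    return read, write

read, write = make_file_store()

def make_agent(capabilities):
    """An agent's environment holds exactly the privileges granted at
    construction time; there is no ambient authority to the store."""
    env = dict(capabilities)
    def invoke(cap, *args):
        return env[cap](*args)
    return invoke

alice = make_agent({"read": read, "write": write})
bob = make_agent({"read": read})     # bob is never granted write

alice("write", "notes", "hello")
print(bob("read", "notes"))          # -> hello
```

    bob was simply never handed the write capability when his environment was constructed, so no access check is needed at the point of use; protection follows from how the environment was built, not from a runtime permission table.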

  • Brian LaMacchia invented the Internet Fish, a novel resource-discovery tool designed to help users extract and refine useful information from the rich ore seams of the Internet. Unlike monolithic indexing schemes, Mr. LaMacchia's fish are sophisticated persistent autonomous information agents that are intended to model the behavior of research librarians. A user may deploy a fish intended to develop information on a particular topic. The fish will initiate research on that topic, continue to discover new sources of information on that topic, and will keep tabs on new developments in that topic. From time to time the user will interact with his fish, as it swims through cyberspace, to find out what it has learned, and to make suggestions for guidance.

    The internet fish is an entirely new kind of "life" on the network. It differs from a search engine or indexing system in that it is personal, persistent, and autonomous. It explores, maintaining internal state, accumulating information, and interacting with the user in extended conversations. It incorporates deep structural knowledge of the organization of the net; it includes explicit knowledge of human naming conventions; and it is capable of on-the-fly reconfiguration, modification and expansion. A human user may specify or dynamically change heuristic policies to be used to help determine how interesting a datum is, and the fish may make these changes itself. Because the fish maintains a model and representation of its own structure and behavior, as well as a model of its information environment and its user, a fish can do meta-level reasoning. Thus LaMacchia's internet fish display the signs of a feeble sort of consciousness and a weak symbiotic relationship with the users.

    But LaMacchia did more. He did not just make a fish, he made an Internet Fish Construction Kit. This is enabling technology encapsulating a variety of useful heuristic and technical information into a catalog of mix-and-match components for fish-like systems. This is important because the separation of the fish into almost orthogonal components gives us a crude epistemology of the network. It also allows one to add interfaces to new services easily and immediately incorporate them into a fish.

    Recognition for Research and Education

  • Over the last five years, five of our recent graduates have received National Young Investigator awards, largely based on their work on the development and application of intelligent simulation technology. Jack Wisdom was selected as a MacArthur Fellow, partly on the basis of work done in our group. Gerry Sussman was selected as a founding Fellow of the ACM and a Fellow of the American Academy of Arts and Sciences, and was winner of the ACM Karl Karlstrom Outstanding Educator Award in 1991. Hal Abelson was selected as a Fellow of the IEEE, and he is the winner of the 1995 Taylor L. Booth Education Award, given by the IEEE Computer Society, cited for his continued contributions to the pedagogy and teaching of introductory computer science.

    Technology Transfer

    We have played a key role in the standardization of Scheme by ANSI and by the IEEE (IEEE Std 1178-1990). Scheme is a dialect of Lisp that is well-suited for educational use and for experiments in symbolic computing and parallelism. Our MIT Scheme implementation for Windows and Unix-based systems, which we maintain and distribute over the Internet, includes a high-performance compiler and an integrated text editor. Our research in novel compilation technology, and in mixed numeric/symbolic computation, rests on this substrate.

    We estimate that MIT Scheme is currently used at over 100 sites around the world. It serves as the primary computational vehicle for core computer science courses at many US universities, including MIT, Berkeley, Stanford, Princeton, and Cornell. Many of these courses are based on the textbook Structure and Interpretation of Computer Programs, by Hal Abelson, Gerry Sussman, and Julie Sussman. In 1996, we completed a second edition of this book, which is published by the MIT Press and the McGraw-Hill book company.

    Videotapes based on our lectures, produced by Hewlett-Packard Co., have played major roles in industrial training programs at companies including Hewlett-Packard, Texas Instruments, Digital Equipment, and Harris Corporation.

    We have collaborated with Hewlett-Packard Co. on a number of specific projects, including the development of the Supercomputer Toolkit, and our work on intelligent structures is being followed up at Xerox PARC.

    Hal Abelson
    Gerry Sussman

    Last modified June 29, 1996