
of this complicated phenomenon and comes to grips with its main effects on real physical systems. As Frank mentions in "Early Work in Numerical Hydrodynamics," the approach taken in turbulence transport theory is in line with Ulam's early insight about modeling turbulence: that one needs to model not the detailed shape of the fluid flow but rather the rate of energy flow from large to smaller and smaller length scales. "Discrete Fluids" by Brosl Hasslacher introduces an alternate and completely novel approach to modeling complex fluid flows. The model is deceptively simple: a set of particles that live on a lattice of discrete points. They hop from point to point at constant speed, and when they run into one another they change direction in a way that conserves momentum. That is all there is, but in a miraculous manner still not completely understood this model reproduces the whole spectrum of collective motion in fluids, from smooth to turbulent flow. (A caricature of such a rule set is sketched below.) This lattice gas automaton is a variant of the cellular automata invented by Ulam and von Neumann. Its behavior embodies one of Ulam's favorite insights: that simple rules are capable of describing arbitrarily complex behavior (maybe even the behavior of the human brain). Although the model is simple, Brosl's three-part article is rich in ideas, suggesting a new paradigm for parallel computing and for modeling many other physical systems usually described by partial differential equations.
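To make the rule set concrete, here is a minimal sketch in Python of the square-lattice HPP variant of such a lattice gas. It is a simpler cousin of the hexagonal model Hasslacher describes, not his model itself, and the grid size, particle density, and periodic boundaries are illustrative choices only.

```python
import numpy as np

# Minimal HPP-style lattice gas: each site holds up to four particles,
# one per direction (east, north, west, south), all moving at unit speed.
H, W = 64, 64
rng = np.random.default_rng(0)
cells = rng.random((4, H, W)) < 0.2      # boolean occupation numbers

def collide(cells):
    # Head-on pairs scatter at right angles; particle number and
    # (zero net) momentum are conserved at every site.
    e, n, w, s = cells
    ew = e & w & ~n & ~s                 # east-west head-on collision
    ns = n & s & ~e & ~w                 # north-south head-on collision
    return np.stack([(e & ~ew) | ns,
                     (n & ~ns) | ew,
                     (w & ~ew) | ns,
                     (s & ~ns) | ew])

def stream(cells):
    # Every particle hops one site in its direction (periodic edges).
    e, n, w, s = cells
    return np.stack([np.roll(e, 1, axis=1),
                     np.roll(n, -1, axis=0),
                     np.roll(w, -1, axis=1),
                     np.roll(s, 1, axis=0)])

for _ in range(100):
    cells = stream(collide(cells))

density = cells.sum(axis=0)              # coarse-grained density field
```

Averaging such occupation numbers over space and time is what recovers smooth, fluid-like fields; the collision rule above is the entire microscopic physics.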

At this point the reader may be overwhelmed by the sprawling array of topics and approaches that appear to be part of nonlinear science. Why call it a single science instead of many different ones? In the remarkably informative overview "Nonlinear Science-From Paradigms to Practicalities" David Campbell pulls together the common features and methodology that link disparate phenomena into a single new discipline. He identifies three major paradigms of this field: solitons and coherent structures, chaos and fractals, and complex configurations and patterns. He then follows through with many examples of physical and mathematical systems in which these paradigms play a significant role. The latter two paradigms have roots in Ulam's work with Stein, discussed in the mathematics section of "The Ulam Legacy," and all three have roots in the FPU problem, the subject of Fermi, Pasta, and Ulam's seminal paper on nonlinear systems. (Excerpts of this paper appear toward the end of the physics section.)

The FPU results, apart from leading to the discovery of the soliton by Kruskal and Zabusky, were the first indication that the ergodic hypothesis of statistical mechanics may be wrong. In the last paper of this section Adrian Patrascioiu, the 1986 Ulam Scholar at the Laboratory's Center for Nonlinear Studies, describes further results in this direction and speculates that a reconsideration of the ergodic hypothesis may lead to profound changes in our understanding of the meaning of quantum mechanics.

The field of nonlinear science is changing our understanding of the world around us. As experimental mathematics uncovers certain universal features of complex deterministic systems, it also brings us face to face with the limits of their predictability. We hope this section illustrates a remark from Ulam's 1984 preface: "...a mathematical turn of mind, a mathematical habit of thinking, a way of looking at problems in different subfields [of science] ... can suggest general insights and not just offer the mere use of techniques."


The year was 1945. Two earthshaking events took place: the successful test at Alamogordo and the building of the first electronic computer. Their combined impact was to modify qualitatively the nature of global interactions between Russia and the West. No less perturbative were the changes wrought in all of academic research and in applied science. On a less grand scale these events brought about a renascence of a mathematical technique known to the old guard as statistical sampling; in its new surroundings and owing to its nature, there was no denying its new name of the Monte Carlo method.

This essay attempts to describe the details that led to this renascence and the roles played by the various actors. It is appropriate that it appears in an issue dedicated to Stan Ulam.

Some Background

Most of us have grown so blasé about computer developments and capabilities, even some that are spectacular, that it is difficult to believe or imagine there was a time when we suffered the noisy, painstakingly slow, electromechanical devices that chomped away on punched cards. Their saving grace was that they continued working around the clock, except for maintenance and occasional repair (such as removing a dust particle from a relay gap). But these machines helped enormously with the routine, relatively simple calculations that led to Hiroshima.

The ENIAC. During this wartime period, a team of scientists, engineers, and technicians was working furiously on the first electronic computer, the ENIAC, at the University of Pennsylvania in Philadelphia. Their mentors were Physicist First Class John Mauchly and Brilliant Engineer Presper Eckert. Mauchly, familiar with Geiger counters in physics laboratories, had realized that if electronic circuits could count, then they could do arithmetic and hence solve, inter alia, difference equations, at almost incredible speeds! When he'd seen a seemingly limitless array of women cranking out firing tables with desk calculators, he'd been inspired to propose to the Ballistics Research Laboratory at Aberdeen that an electronic computer be built to deal with these calculations.

John von Neumann, Professor of Mathematics at the Institute for Advanced Study, was a consultant to Aberdeen and to Los Alamos. For a whole host of reasons, he had become seriously interested in the thermonuclear problem being spawned at that time in Los Alamos by a friendly fellow-Hungarian scientist, Edward Teller, and his group. Johnny (as he was affectionately called) let it be known that construction of the ENIAC was nearing completion, and he wondered whether Stan Frankel and I would be interested in preparing a preliminary computational model of a thermonuclear reaction for the ENIAC. He felt he could convince the authorities at Aberdeen that our problem could provide a more exhaustive test of the computer than mere firing-table computations. (The designers of the ENIAC had wisely provided for the capability of much more ambitious versions of firing tables than were being arduously computed by hand, not to mention other quite different applications.) Our response to von Neumann's suggestion was enthusiastic, and his heuristic arguments were accepted by the authorities at Aberdeen.

In March, 1945, Johnny, Frankel, and I visited the Moore School of Electrical Engineering at the University of Pennsylvania for an advance glimpse of the ENIAC. We were impressed. Its physical size was overwhelming: some 18,000 double-triode vacuum tubes in a system with 500,000 solder joints. No one ever had such a wonderful toy!

The staff was dedicated and enthusiastic; the friendly cooperation is still remembered. The prevailing spirit was akin to that in Los Alamos. What a pity that a war seems necessary to launch such revolutionary scientific endeavors. The components used in the ENIAC were joint army-navy (JAN) rejects. This fact not only emphasizes the genius of Eckert and Mauchly and their staff, but also suggests that the ENIAC was technically realizable even before we entered the war in December, 1941.

After becoming saturated with indoctrination about the general and detailed structure of the ENIAC, Frankel and I returned to Los Alamos to work on a model that was realistically calculable. (There was a small interlude at Alamogordo!) The war ended before we completed our set of problems, but it was agreed that we continue working. Anthony Turkevich joined the team and contributed substantially to all aspects of the work. Moreover, the uncertainty of the first phase of the postwar Los Alamos period prompted Edward Teller to urge us not only to complete the thermonuclear computations but to document and provide a critical review of the results.


The Spark. The review of the ENIAC results was held in the spring of 1946 at Los Alamos. In addition to Edward Teller, the principals included Enrico Fermi, John von Neumann, and the Director, Norris Bradbury. Stanley Frankel, Anthony Turkevich, and I described the ENIAC, the calculations, and the conclusions. Although the model was relatively simple, the simplifications were taken into account and the extrapolated results were cause for guarded optimism about the feasibility of a thermonuclear weapon.

Among the attendees was Stan Ulam, who had rejoined the Laboratory after a brief time on the mathematics faculty at the University of Southern California. Ulam's personality would stand out in any community, even where "characters" abounded. His was an informal nature; he would drop in casually, without the usual amenities. He preferred to chat, more or less at leisure, rather than to dissertate. Topics would range over mathematics, physics, world events, local news, games of chance, quotes from the classics, all treated somewhat episodically but always with a meaningful point. His was a mind ready to provide a critical link.

During his wartime stint at the Laboratory, Stan had become aware of the electromechanical computers used for implosion studies, so he was duly impressed, along with many other scientists, by the speed and versatility of the ENIAC. In addition, however, Stan's extensive mathematical background made him aware that statistical sampling techniques had fallen into desuetude because of the length and tediousness of the calculations. But with this miraculous development of the ENIAC, along with the applications Stan must have been pondering, it occurred to him that statistical techniques should be resuscitated, and he discussed this idea with von Neumann. Thus was triggered the spark that led to the Monte Carlo method.

[photo: Stanislaw Ulam]

The Method

The spirit of this method was consistent with Stan's interest in random processes, from the simple to the sublime. He relaxed playing solitaire; he was stimulated by playing poker; he would cite the times he drove into a filled parking lot at the same moment someone was accommodatingly leaving. More seriously, he created the concept of "lucky numbers," whose distribution was much like that of prime numbers; he was intrigued by the theory of branching processes and contributed much to its development, including its application during the war to neutron multiplication in fission devices. For a long time his collection of research interests included pattern development in two-dimensional games played according to very simple rules. Such work has lately emerged as a cottage industry known as cellular automata.

John von Neumann saw the relevance of Ulam's suggestion and, on March 11, 1947, sent a handwritten letter to Robert Richtmyer, the Theoretical Division leader (see "Stan Ulam, John von Neumann, and the Monte Carlo Method"). His letter included a detailed outline of a possible statistical approach to solving the problem of neutron diffusion in fissionable material.

Johnny's interest in the method was contagious and inspiring. His seemingly relaxed attitude belied an intense interest and a well-disguised impatient drive. His talents were so obvious and his cooperative spirit so stimulating that he garnered the interest of many of us. It was at that time that I suggested an obvious name for the statistical method, a suggestion not unrelated to the fact that Stan had an uncle who would borrow money from relatives because he "just had to go to Monte Carlo." The name seems to have endured.

The spirit of Monte Carlo is best conveyed by the example discussed in von Neumann's letter to Richtmyer. Consider a spherical core of fissionable material surrounded by a shell of tamper material. Assume some initial distribution of neutrons in space and in velocity, but ignore radiative and hydrodynamic effects. The idea is then to follow the development of a large number of individual neutron chains as a consequence of scattering, absorption, fission, and escape.

At each stage a sequence of decisions has to be made based on statistical probabilities appropriate to the physical and geometric factors. The first two decisions occur at time t = 0, when a neutron is selected to have a certain velocity and a certain spatial position. The next decisions are the position of the first collision and the nature of that collision. If it is determined that a fission occurs, the number of emerging neutrons must be decided upon, and each of these neutrons is eventually followed in the same fashion as the first. If the collision is decreed to be a scattering, appropriate statistics are invoked to determine the new momentum of the neutron. When the neutron crosses a material boundary, the parameters and characteristics of the new medium are taken into account. Thus, a genealogical history of an individual neutron is developed. The process is repeated for other neutrons until a statistically valid picture is generated.

[photo: John von Neumann]
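The decision sequence just described can be captured in a few dozen lines. The sketch below, in Python, plays the game for a single bare sphere of one fictitious material; every number in it (the radius, mean free path, collision probabilities, and fission multiplicity) is an invented placeholder rather than real cross-section data, and, as in von Neumann's example, radiative and hydrodynamic effects are ignored.

```python
import math, random

# Toy neutron-chain Monte Carlo in a bare sphere of one material.
# All constants are illustrative placeholders, not nuclear data.
R = 3.0                                  # sphere radius (arbitrary units)
MEAN_FREE_PATH = 1.0                     # distance scale between collisions
P_ABSORB, P_FISSION = 0.25, 0.15         # per-collision outcome probabilities
NU = 2.5                                 # average neutrons per fission

def isotropic():
    # Random unit vector, uniform over all directions.
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

def follow(start):
    """Follow one source neutron and all its progeny; count fissions."""
    fissions = 0
    stack = [(start, isotropic())]       # the genealogy, kept on a stack
    while stack:
        (x, y, z), (u, v, w) = stack.pop()
        while True:
            # Decision: distance to the next collision (exponential law).
            d = -MEAN_FREE_PATH * math.log(1.0 - random.random())
            x, y, z = x + d * u, y + d * v, z + d * w
            if x * x + y * y + z * z > R * R:
                break                    # escaped through the surface
            roll = random.random()       # decision: nature of the collision
            if roll < P_ABSORB:
                break                    # absorbed; this branch ends
            elif roll < P_ABSORB + P_FISSION:
                fissions += 1
                # Decision: number of emerging neutrons (crude sampling of NU).
                n_out = int(NU) + (random.random() < NU - int(NU))
                for _ in range(n_out):
                    stack.append(((x, y, z), isotropic()))
                break                    # incoming neutron is consumed
            else:
                u, v, w = isotropic()    # isotropic scatter; keep following
    return fissions

histories = [follow((0.0, 0.0, 0.0)) for _ in range(1000)]
print("mean fissions per source neutron:", sum(histories) / len(histories))
```

Tallying escapes, absorptions, and fissions over many such histories is what builds up the "statistically valid picture" referred to above.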

Random Numbers. How are the various decisions made? To start with, the computer must have a source of uniformly distributed pseudo-random numbers. A much used algorithm for generating such numbers is the so-called von Neumann "middle-square digits." Here, an arbitrary n-digit integer is squared, creating a 2n-digit product. A new integer is formed by extracting the middle n digits from the product. This process is iterated over and over, forming a chain of integers whose properties have been extensively studied. Clearly, this chain of numbers repeats after some point. D. H. Lehmer has suggested a scheme based on the Kronecker-Weyl theorem that generates all possible numbers of n digits before it repeats. (See "Random-Number Generators" for a discussion of various approaches to the generation of random numbers.)
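As an illustration, the middle-square iteration is only a few lines of Python; the seed and digit count below are arbitrary choices, and the short cycles alluded to above do show up quickly in practice.

```python
# Von Neumann middle-square generator: square an n-digit integer,
# zero-pad the square to 2n digits, and keep the middle n digits.
def middle_square(seed, n=4, count=10):
    x = seed
    for _ in range(count):
        sq = str(x * x).zfill(2 * n)      # the 2n-digit product
        x = int(sq[n // 2 : n // 2 + n])  # extract the middle n digits
        yield x

print(list(middle_square(1234)))          # 5227, 3215, 3362, ...
```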

Once one has an algorithm for generating a uniformly distributed set of random numbers, these numbers must be transformed into the nonuniform distribution g desired for the property of interest. It can be shown that the function f needed to achieve this transformation is just the inverse of the nonuniform distribution function, that is, f = g⁻¹. For example, neutron physics shows us that the distribution of free paths (that is, how far neutrons of a given energy in a given material go before colliding with a nucleus) decreases exponentially in the interval (0, ∞). If x is uniformly distributed in the open interval (0, 1), then f(x) = −ln x will give us a nonuniform distribution g with just those properties.
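A small sketch of this inverse-transform trick in Python: the cross section sigma below is an arbitrary stand-in, not a physical value, but the sample mean should approach the mean free path 1/sigma.

```python
import math, random

sigma = 0.5                              # illustrative macroscopic cross section

def free_path():
    # Uniform x on (0,1] mapped through f = g^(-1), here f(x) = -ln(x)/sigma,
    # giving exponentially distributed free paths with mean 1/sigma.
    # (1.0 - random.random() avoids x = 0, where the log blows up.)
    return -math.log(1.0 - random.random()) / sigma

paths = [free_path() for _ in range(100_000)]
print(sum(paths) / len(paths))           # should be close to 1/sigma = 2.0
```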

The reader will appreciate many of the advantages of the Monte Carlo method compared to the methods of differential equations. For example, a neutron-velocity spectrum with various peaks and valleys is difficult to handle mathematically. For Monte Carlo one needs only to mirror the velocity spectrum in the probability distribution. Also, the Monte Carlo method is sufficiently flexible to account for hydrodynamic effects in a self-consistent way. In an even more elaborate code, radiation effects can be dealt with by following the photons and their interactions (see "Monte Carlo at Work").

Clearly, applications of the Monte Carlo method are much broader than so far outlined. (Although I emphasize the use of Monte Carlo in the study of physical systems, random sampling is also an efficient way to evaluate complicated and many-dimensional integrals. For an example, see the section entitled "The Monte Carlo Method" in "A Primer on Probability, Measure, and the Laws of Large Numbers.") Since its inception, many international conferences have been held on the various applications of the method. Recent ones range from "Monte Carlo Methods and Applications in Neutronics, Photonics, and Statistical Physics," held at Cadarache Castle, France, in the spring of 1985, to the latest, "Frontiers of Quantum Monte Carlo," at Los Alamos in September, 1985.
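To illustrate the point about many-dimensional integrals, here is a minimal Python example: averaging an integrand at uniformly random points in the 10-dimensional unit cube. The particular integrand is chosen arbitrarily, because its exact value is easy to check.

```python
import random

# Monte Carlo estimate of the integral of f over the unit cube [0,1]^10:
# the estimator is simply the sample mean of f at n random points.
def f(x):
    p = 1.0
    for xi in x:                         # product of coordinates; the exact
        p *= xi                          # integral over [0,1]^10 is (1/2)^10
    return p

dim, n = 10, 200_000
total = sum(f([random.random() for _ in range(dim)]) for _ in range(n))
print(total / n, "vs exact", 0.5 ** dim)
```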

Putting the Method into Practice

Let me return to the historical account. In late 1947 the ENIAC was to be moved to its permanent home at the Ballistics Research Laboratory in Maryland. What a gargantuan task! Few observers were of the opinion that it would ever do another multiplication or even an addition. It is a tribute to the patience and skill of Josh Gray and Richard Merwin, two fearless uninitiates, that the move was a success. One salutary effect of the interruption for Monte Carlo was that another distinguished physicist took this occasion to resume his interest in statistical studies. Enrico Fermi helped create modern physics. Here, we focus on his interest in neutron diffusion during those exciting times in Rome in the early thirties. According to Emilio Segrè, Fermi's student and collaborator, "Fermi had invented, but of course not named, the present Monte Carlo method when he was studying the moderation of neutrons in Rome. He did not publish anything on the subject, but he used the method to solve many problems with whatever calculating facilities he had, chiefly a small mechanical adding machine."*

In a recent conversation with Segrè, I learned that Fermi took great delight in astonishing his Roman colleagues with his remarkably accurate, "too-good-to-believe" predictions of experimental results. After indulging himself, he revealed that his "guesses" were really derived from the statistical sampling techniques that he used to calculate with whenever insomnia struck in the wee morning hours! And so it was that nearly fifteen years earlier, Fermi had independently developed the Monte Carlo method.

*Quoted with permission of W. H. Freeman and Company from From X-Rays to Quarks by Emilio Segrè.

It was then natural for Fermi, during the hiatus in the ENIAC operation, to dream up a simple but ingenious analog device to implement studies in neutron transport. He persuaded his friend and collaborator Percy King, while on a hike one Sunday morning in the mountains surrounding Los Alamos, to build such an instrument, later affectionately called the FERMIAC (see the accompanying photo).

The FERMIAC developed neutron genealogies in two dimensions, that is, in a plane, by generating the site of the "next collision." Each generation was based on a choice of parameters that characterized the particular material being traversed. When a material boundary was crossed, another choice was made appropriate to the new material. The device could accommodate two neutron energies, referred to as "slow" and "fast." Once again, the Master had just the right feel for what was meaningful and relevant to do in the pursuit of science.

The First Ambitious Test. Much to the amazement of many "experts," the ENIAC survived the vicissitudes of its 200-mile journey. In the meantime Richard Clippinger, a staff member at Aberdeen, had suggested that the ENIAC had sufficient flexibility to permit its controls to be reorganized into a more convenient (albeit static) stored-program mode of operation. This mode would have a capacity of 1800 instructions from a vocabulary of about 60 arithmetical and logical operations. The previous method of programming might be likened to a giant plugboard, that is to say, to a can of worms. Although implementing the new approach is an interesting story, suffice it to say that Johnny's wife, Klari, and I designed the new controls in about two months and completed the implementation in a fortnight. We then had the opportunity of using the ENIAC for the first ambitious test of the Monte Carlo method: a variety of problems in neutron transport done in collaboration with Johnny.

Nine problems were computed corresponding to various configurations of materials, initial distributions of neutrons, and running times. These problems, as yet, did not include hydrodynamic or radiative effects, but complex geometries and realistic neutron-velocity spectra were handled easily. The neutron histories were subjected to a variety of statistical analyses and comparisons with other approaches. Conclusions about the efficacy of the method were quite favorable. It seemed as though Monte Carlo was here to stay.

Not long afterward, other Laboratory
