THE FERMIAC

The Monte Carlo trolley, or FERMIAC, was invented by Enrico Fermi and constructed by Percy King. The drums on the trolley were set according to the material being traversed and a random choice between fast and slow neutrons. Another random digit was used to determine the direction of motion, and a third was selected to give the distance to the next collision. The trolley was then operated by moving it across a two-dimensional scale drawing of the nuclear device or reactor assembly being studied. The trolley drew a path as it rolled, stopping for changes in drum settings whenever a material boundary was crossed. This infant computer was used for about two years to determine, among other things, the change in neutron population with time in numerous types of nuclear systems.
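In modern notation, the drum settings and random digits amount to sampling a speed group, a direction in the plane, and a free-flight distance at each step. The short Python sketch below is only an illustration of that procedure; the material names, mean free paths, and the use of an exponential distribution for the flight distance are assumptions, not data from the original device.

```python
import math
import random

# Illustrative two-group data: mean free path (cm) by material and speed group.
# These numbers are placeholders, not the FERMIAC's actual drum settings.
MATERIALS = {
    "core":      {"fast": 3.0, "slow": 1.0},
    "reflector": {"fast": 5.0, "slow": 2.0},
}

def fermiac_step(material):
    """One trolley step: pick fast or slow, a direction in the plane,
    and a distance to the next collision."""
    speed = random.choice(["fast", "slow"])      # random fast/slow choice
    angle = random.uniform(0.0, 2.0 * math.pi)   # random direction of motion
    mean_free_path = MATERIALS[material][speed]
    distance = -mean_free_path * math.log(1.0 - random.random())  # distance to next collision
    return speed, angle, distance
```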

staff members made their pilgrimages to ENIAC to run Monte Carlo problems. These included J. Calkin, C. Evans, and F. Evans, who studied a thermonuclear problem using a cylindrical model as well as the simpler spherical one. B. Suydam and R. Stark tested the concept of artificial viscosity on time-dependent shocks; they also, for the first time, tested and found satisfactory an approach to hydrodynamics using a realistic equation of state in spherical geometry. Also, the distinguished (and mysterious) mathematician C. J. Everett was taking an interest in Monte Carlo that would culminate in a series of outstanding publications in collaboration with E. Cashwell. Meanwhile, Richtmyer was very actively running Monte Carlo problems on the so-called SSEC during its brief existence at IBM in New York.

In many ways, as one looks back, it was among the best of times.

Rapid Growth. Applications discussed in the literature were many and varied and spread quickly. By midyear 1949 a symposium on the Monte Carlo method, sponsored by the Rand Corporation, the National Bureau of Standards, and the Oak Ridge Laboratory, was held in Los Angeles. Later, a second symposium was organized by members of the Statistical Laboratory at the University of Florida in Gainesville.

In early 1952 a new computer, the MANIAC, became operational at Los Alamos. Soon after, Anthony Turkevich led a study of the nuclear cascades that result when an accelerated particle collides with a nucleus. The incoming particle strikes a nucleon, experiencing either an elastic or an inelastic scattering, with the latter event producing a pion. In this study particles and their subsequent collisions were followed until all particles either escaped from the nucleus or their energy dropped below some threshold value. The "experiment" was repeated until sufficient statistics were accumulated. A whole series of target nuclei and incoming particle energies was examined.

Another computational problem run on the MANIAC was a study of equations of state based on the two-dimensional motion of hard spheres. The work was a collaborative effort with the Tellers, Edward and Mici, and the Rosenbluths, Marshall and Arianna (see "Monte Carlo at Work"). During this study a strategy was developed that led to greater computing efficiency for equilibrium systems obeying the Boltzmann distribution function. According to this strategy, if a statistical "move" of a particle in the system resulted in a decrease in the energy of the system, the new configuration was accepted. On the other hand, if there was an increase in energy, the new configuration was accepted only if it survived a game of chance biased by a Boltzmann factor. Otherwise, the old configuration became a new statistic.
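This acceptance rule is what is now known as the Metropolis algorithm. The following sketch is a modern Python rendering of the rule as described above, not the original MANIAC code; the `energy` and `propose_move` callables, and the use of units in which Boltzmann's constant equals one, are assumptions made for illustration.

```python
import math
import random

def metropolis_step(config, energy, propose_move, temperature):
    """One statistical "move": accept it outright if the energy drops;
    otherwise accept it only with probability exp(-dE/kT) (k = 1 here).
    If the move is rejected, the old configuration is counted again."""
    trial = propose_move(config)
    dE = energy(trial) - energy(config)
    if dE <= 0.0 or random.random() < math.exp(-dE / temperature):
        return trial   # new configuration accepted
    return config      # old configuration becomes a new statistic
```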

It is interesting to look back over twoscore years and note the emergence, rather early on, of experimental mathematics, a natural consequence of the electronic computer. The role of the Monte Carlo method in reinforcing such mathematics seems self-evident. When display units were introduced, the temptation to experiment became almost irresistible, at least for the fortunate few who enjoyed the luxury of a hands-on policy. When shared-time operations became realistic, experimental mathematics came of age. At long last, mathematics achieved a certain parity, the twofold aspect of experiment and theory, that all other sciences enjoy.

It is, in fact, the coupling of the subtleties of the human brain with rapid and reliable calculations, both arithmetical and logical, by the modern computer that has stimulated the development of experimental mathematics. This development will enable us to achieve Olympian heights.

The Future

So far I have summarized the rebirth of statistical sampling under the rubric of Monte Carlo. What of the future, perhaps even a not-too-distant future?

The miracle of the chip, like most miracles, is almost unbelievable. Yet the fantastic performances achieved to date have not quieted all users. At the same time we are reaching upper limits on the computing power of a single processor.

One bright facet of the miracle is the lack of macroscopic moving parts, which makes the chip a very reliable bit of hardware. Such reliability suggests parallel processing. The thought here is not a simple extension to two, or even four or eight, processing systems. Such extensions are adiabatic transitions that, to be sure, should be part of the immediate, short-term game plan. Rather, the thought is massively parallel operations with thousands of interacting processors, even millions!

Already commercially available is one computer, the Connection Machine, with 65,536 simple processors working in parallel. The processors are linked in such a way that no processor in the array is more than twelve wires away from another and the processors are pairwise connected by a number of equally efficient routes, making communication both flexible and efficient. The computer has been used on such problems as turbulent fluid flow, image processing (with features analogous to the human visual system), document retrieval, and "common-sense" reasoning in artificial intelligence.

One natural application of massive parallelism would be to the more ambitious Monte Carlo problems already upon us. To achieve good statistics in Monte Carlo calculations, a large number of "histories" need to be followed. Although each history has its own unique path, the underlying calculations for all paths are highly parallel in nature.
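A present-day sketch of that highly parallel structure, using only Python's standard library rather than any particular parallel machine, might look like the following; the per-history work is a placeholder, and the batch and worker counts are arbitrary.

```python
# Each worker follows its own batch of histories with its own random-number
# stream, and only the tallies are combined at the end. run_history() is a
# stand-in for a real transport routine.
import random
from concurrent.futures import ProcessPoolExecutor

def run_history(seed):
    rng = random.Random(seed)                # independent stream per history
    # ... follow one neutron history here and return its tally ...
    return 1 if rng.random() < 0.5 else 0    # placeholder tally

def run_batch(seeds):
    return sum(run_history(s) for s in seeds)

if __name__ == "__main__":
    seeds = list(range(100_000))
    batches = [seeds[i::8] for i in range(8)]   # 8 independent batches
    with ProcessPoolExecutor(max_workers=8) as pool:
        total = sum(pool.map(run_batch, batches))
    print("tally:", total, "of", len(seeds), "histories")
```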

Still, the magnitude of the endeavor to compute on massively parallel devices must not be underestimated. Some of the tools and techniques needed are:

• A high-level language and new architecture able to deal with the demands of such a sophisticated language (to the relief of the user);

• Highly efficient operating systems and compilers;

• Use of modern combinatorial theory, perhaps even new principles of logic, in the development of elegant, comprehensive architectures;

• A fresh look at numerical analysis and the preparation of new algorithms (we have been mesmerized by serial computation and purblind to the sophistication and artistry of parallelism).

Where will all this lead? If one were to wax enthusiastic, perhaps, just perhaps, a simplified model of the brain might be studied. These studies, in turn, might provide feedback to computer architects designing the new parallel structures.

Such matters fascinated Stan Ulam. He often mused about the nature of memory and how it was implemented in the brain. Most important, though, his own brain possessed the fertile imagination needed to make substantive contributions to the very important pursuit of understanding intelligence.■

Further Reading

S. Ulam, R. D. Richtmyer, and J. von Neumann. 1947. Statistical methods in neutron diffusion. Los Alamos Scientific Laboratory report LAMS-551. This reference contains the von Neumann letter discussed in the present article.

N. Metropolis and S. Ulam. 1949. The Monte Carlo method. Journal of the American Statistical Association 44:335-341.

S. Ulam. 1950. Random processes and transformations. Proceedings of the International Congress of Mathematicians 2:264-275.

Los Alamos Scientific Laboratory. 1966. Fermi invention rediscovered at LASL. The Atom, October, pp. 7-11.

C. C. Hurd. 1985. A note on early Monte Carlo computations and scientific meetings. Annals of the History of Computing 7:141-155.

W. Daniel Hillis. 1987. The connection machine. Scientific American, June, pp. 108-115.

N. Metropolis received his B.S. (1937) and his Ph.D. (1941) in physics at the University of Chicago. He arrived in Los Alamos, April 1943, as a member of the original staff of fifty scientists. After the war he returned to the faculty of the University of Chicago as Assistant Professor. He came back to Los Alamos in 1948 to form the group that designed and built MANIAC I and II. (He chose the name MANIAC in the hope of stopping the rash of such acronyms for machine names, but may have, instead, only further stimulated such use.) From 1957 to 1965 he was Professor of Physics at the University of Chicago and was the founding Director of its Institute for Computer Research. In 1965 he returned to Los Alamos where he was made a Laboratory Senior Fellow in 1980. Although he retired recently, he remains active as a Laboratory Senior Fellow Emeritus.

STAN ULAM, JOHN VON NEUMANN, and the MONTE CARLO METHOD

by Roger Eckhardt

The Monte Carlo method is a statistical sampling technique that over the years has been applied successfully to a vast number of scientific problems. Although the computer codes that implement Monte Carlo have grown ever more sophisticated, the essence of the method is captured in some unpublished remarks Stan made in 1983 about solitaire.

"The first thoughts and attempts I made to practice [the Monte Carlo method] were suggested by a question which occurred to me in 1946 as I was convalescing from an illness and playing solitaires. The question was what are the chances that a Canfield solitaire laid out with 52 cards will come out successfully? After spending a lot of time trying to estimate them by pure

combinatorial calculations, I wondered whether a more practical method than "abstract thinking" might not be to lay it out say one hundred times and simply observe and count the number of successful plays. This was already possible to envisage with the beginning of the new era of fast computers, and I immediately thought of problems of neutron diffusion and other questions of mathematical physics, and more generally how to change processes described by certain differential equations into an equivalent form interpretable as a succession of random operations. Later... [in 1946, I] described the idea to John von Neumann and we began to plan actual calculations."

Von Neumann was intrigued. Statistical sampling was already well known in mathematics, but he was taken by the idea of doing such sampling using the newly developed electronic computing techniques. The approach seemed especially suitable for exploring the behavior of neutron chain reactions in fission devices. In particular, neutron multiplication rates could be estimated and used to predict the explosive behavior of the various fission weapons then being designed.

In March of 1947, he wrote to Robert Richtmyer, at that time the Theoretical Division Leader at Los Alamos (Fig. 1). He had concluded that "the statistical approach is very well suited to a digital treatment," and he outlined in some detail how this method could be used to solve neutron diffusion and multiplication problems in fission devices for the case of "inert" criticality (that is, approximated as momentarily static configurations).

[graphic: scanned first and last pages of von Neumann's letter to Robert Richtmyer, beginning "Dear Bob: This is the letter I promised you in the course of our telephone conversation on Friday, March 7th...", together with a portion of his tentative computing sheet; only fragments of the scan are legible]


Fig. 1. The first and last pages of von Neumann's remarkable letter to Robert Richtmyer are shown above, as well as a portion of his tentative computing sheet. The last illustrates how extensively von Neumann had applied himself to the details of a neutron-diffusion calculation.

This outline was the first formulation of a Monte Carlo computation for an electronic computing machine.

In his formulation von Neumann used a spherically symmetric geometry in which the various materials of interest varied only with the radius. He assumed that the neutrons were generated isotropically and had a known velocity spectrum and that the absorption, scattering, and fission cross-sections in the fissionable material and any surrounding materials (such as neutron moderators or reflectors) could be described as a function of neutron velocity. Finally, he assumed an appropriate accounting of the statistical character of the number of fission neutrons with probabilities specified for the generation of 2, 3, or 4 neutrons in each fission process.

The idea then was to trace out the history of a given neutron, using random digits to select the outcomes of the various interactions along the way. For example, von Neumann suggested that in the computation "each neutron is represented by [an 80-entry punched computer] card... which carries its characteristics," that is, such things as the zone of material the neutron was in, its radial position, whether it was moving inward or outward, its velocity, and the time. The card also carried "the necessary random values" that were used to determine at the next step in the history such things as path length and direction, type of collision, velocity after scattering, up to seven variables in all. A "new" neutron was started (by assigning values to a new card) whenever the neutron under consideration was scattered or whenever it passed into another shell; cards were started for several neutrons if the original neutron initiated a fission. One of the main quantities of interest, of course, was the neutron multiplication rate: for each of the 100 neutrons started, how many would be present after, say, 10⁻⁸ second?
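A compact modern rendering of this card-based bookkeeping is sketched below. The record fields follow the characteristics listed in the letter, but the collision probabilities, the way the seven stored random values are consumed, and all numerical values are assumptions made for illustration; von Neumann's actual computing sheet was far more detailed.

```python
import math
import random
from dataclasses import dataclass, field

@dataclass
class NeutronCard:
    zone: int          # shell of material the neutron is in
    radius: float      # radial position
    outward: bool      # moving inward or outward
    velocity: float    # speed
    time: float        # elapsed time
    # "the necessary random values" -- fresh ones are assigned to every new card
    randoms: list = field(default_factory=lambda: [random.random() for _ in range(7)])

def step(card, mean_free_path=2.0, absorb_prob=0.3, fission_prob=0.2):
    """Advance one neutron by one event and return the resulting cards:
    none if it was absorbed, several if it caused a fission."""
    r = card.randoms
    path = -mean_free_path * math.log(1.0 - r[0])     # free-flight distance
    card.time += path / card.velocity
    card.radius += path if card.outward else -path
    if r[1] < absorb_prob:
        return []                                     # absorbed: retire the card
    if r[2] < fission_prob:                           # fission: 2, 3, or 4 new cards
        n_out = 2 + (r[3] > 0.6) + (r[4] > 0.9)
        return [NeutronCard(card.zone, card.radius, r[5] > 0.5,
                            card.velocity, card.time) for _ in range(n_out)]
    # Scattered: a "new" neutron card is started with a new direction.
    return [NeutronCard(card.zone, card.radius, r[6] > 0.5, card.velocity, card.time)]
```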

At the end of the letter, von Neumann attached a tentative "computing sheet" that he felt would serve as a basis for setting up this calculation on the ENIAC. He went on to say that "it seems to me very likely that the instructions given on this 'computing sheet' do not exceed the 'logical' capacity of the ENIAC." He estimated that if a problem of the type he had just outlined required "following 100 primary neutrons through 100 collisions [each]... of the primary neutron or its descendants," then the calculations would "take about 5 hours." He further stated, somewhat optimistically, that "in changing over from one problem of this category to another one, only a few numerical constants will have to be set anew on one of the 'function table' organs of the ENIAC."

His treatment did not allow "for the displacements, and hence changes of material distribution, caused by hydrodynamics," which, of course, would have to be taken into account for an explosive device. But he stated that "I think that I know how to set up this problem, too: One has to follow, say 100 neutrons through a short time interval Δt; get their momentum and energy transfer and generation in the ambient matter; calculate from this the displacement of matter; recalculate the history of the 100 neutrons by assuming that matter is in the middle position between its original (unperturbed) state and the above displaced (perturbed) state;... iterating in this manner until a 'self-consistent' system of neutron history and displacement of matter is reached. This is the treatment of the first time interval Δt. When it is completed, it will serve as a basis for a similar treatment of the second time interval... etc., etc."
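Read as an algorithm, the passage describes a fixed-point iteration over each time interval. The sketch below is one possible reading of it in Python; `transport`, `hydro_displacement`, `midpoint`, and `converged` are hypothetical stand-ins supplied by the caller, and the structure is an interpretation rather than von Neumann's actual scheme.

```python
def advance_one_interval(state, dt, transport, hydro_displacement, midpoint,
                         converged, max_iter=50):
    """One time interval dt: follow the neutrons, move the matter, and repeat
    with the matter halfway between its old and displaced states until the
    neutron histories and the displacement are mutually consistent."""
    trial = state                               # first pass: matter unperturbed
    displaced = state
    for _ in range(max_iter):
        deposition = transport(trial, dt)       # e.g., follow ~100 neutrons through dt
        displaced = hydro_displacement(state, deposition, dt)
        new_trial = midpoint(state, displaced)  # matter in the "middle position"
        if converged(new_trial, trial):         # self-consistent: stop iterating
            break
        trial = new_trial
    return displaced                            # basis for the next time interval
```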

Von Neumann also discussed the treatment of the radiation that is generated during fission. "The photons, too, may have to be treated 'individually' and statistically, on the same footing as the neutrons. This is, of course, a non-trivial complication, but it can hardly consume much more time and instructions than the corresponding neutronic part. It seems to me, therefore, that this approach will gradually lead to a completely satisfactory theory of efficiency, and ultimately permit prediction of the behavior of all possible arrangements, the simple ones as well as the sophisticated ones."

And so it has. At Los Alamos in 1947, the method was quickly brought to bear on problems pertaining to thermonuclear as well as fission devices, and, in 1948, Stan was able to report to the Atomic Energy Commission about the applicability of the method for such things as cosmic ray showers and the study of the Hamilton-Jacobi partial differential equation. Essentially all the ensuing work on Monte Carlo neutron-transport codes for weapons development and other applications has been directed at implementing the details of what von Neumann outlined so presciently in his 1947 letter (see "Monte Carlo at Work").

In von Neumann's formulation of the neutron diffusion problem, each neutron history is analogous to a single game of solitaire, and the use of random numbers to make the choices along the way is analogous to the random turn of the card. Thus, to carry out a Monte Carlo calculation, one needs a source of random numbers, and many techniques have been developed that pick random numbers that are uniformly distributed on the unit interval (see "Random-Number Generators"). What is really needed, however, are nonuniform distributions that simulate probability distribution functions specific to each particular type of decision. In other words, how does one ensure that in random flights of a neutron, on the average, a fraction e^(−x/λ) travel a distance of x/λ mean free paths or farther without colliding? (For a more mathematical discussion of random variables, probability distribution functions, and Monte Carlo, see pages 68-73 of "A Tutorial on Probability, Measure, and the Laws of Large Numbers.")
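One standard way to produce such a distribution is inverse-transform sampling: if ξ is a uniform random number on (0, 1], then x = −λ ln ξ is exponentially distributed, so a fraction e^(−x/λ) of the sampled flights exceed any given distance x. A minimal Python sketch, not tied to any particular transport code:

```python
import math
import random

def sample_free_path(mean_free_path):
    """Distance to the next collision, drawn so that a fraction exp(-x/lambda)
    of flights travel a distance x or farther without colliding."""
    xi = 1.0 - random.random()          # uniform on (0, 1]; avoids log(0)
    return -mean_free_path * math.log(xi)

# Quick check: the fraction of flights longer than one mean free path
# should be close to exp(-1), about 0.368.
samples = [sample_free_path(1.0) for _ in range(100_000)]
print(sum(s > 1.0 for s in samples) / len(samples))
```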

The history of each neutron is gener
