Classical
music lovers have always appreciated orchestral conductors with
great dynamic range. Arturo Toscanini, Fritz Reiner, and Bruno
Walter were renowned for making symphony orchestras both thunder
and whisper. Walter in particular won a reputation for the ability
to serve up the tinkle of a triangle and an actual cannon in Tchaikovsky's
1812 Festival Overture. From pianissimo to fortissimo, Bruno Walter
explored the very limits of audibility. And if there is a Bruno
Walter of cosmology, it is surely Michael L. Norman, a physics
professor at UC San Diego.
Figure 1: Very First Protostellar Object
Norman's cosmological
structure code, Enzo, can span 12 orders of magnitude in space and
time. It can pursue the slightest gravitational perturbation of
a nearly uniform primal gas (the tinkling triangle of astrophysics) all
the way to the cannon-roaring condensations of gas, crushed into
volumes so much tinier than the initial volume that the ratio can
only be expressed in scientific notation: 10^-30. Enzo can follow
the story from the consequences of the Big Bang to the coalescence
of the first star in exquisite detail; indeed, in quadruple
precision, in a manner faithful to the best idea cosmologists
have of the initial physics and chemistry of the process.
With former graduate students Greg Bryan (now a lecturer at Oxford
University) and Tom Abel (now a visiting scientist at Cambridge
University), Norman has submitted a paper on the most recent Enzo
calculations as an entry for this year's Gordon Bell Award.
The prize, named after a pioneer in high-performance computing,
is given annually at the Supercomputing conference for the best
performance improvement in a parallel computing application.
Norman's group has also written another paper focused on its
results, which will appear in Science in the coming months.
The physics of present-day star formation is complicated, because
the interstellar medium is itself composed of the remnants of previous
generations of stars, including not only hydrogen and helium but
also heavier elements, in abundances important enough to affect
the star-formation process.
"In contrast," said Norman, "the formation of the
first star takes place in a much simpler environment: the gas is
only hydrogen and helium, and the initial conditions can be precisely
specified by cosmological models. It's a clean initial-value
problem, and it's the starting point for the formation
of all other structure in the universe, from galaxies to superclusters."
Using the NPACI Blue Horizon supercomputer at SDSC, Bryan, Abel,
and Norman simulated the condensation of the universe's first
star (Figure 1).
They began with what most astronomers believe made up the material
composition of the universe: 10 percent ordinary baryonic
matter (composed of protons, neutrons, and electrons) and 90
percent cold dark matter, which is exotic, nonbaryonic material.
Cold dark matter is a generic name for species of weakly interacting
particles that are cold (with negligible velocity dispersion) at
the era when the universe first became matter dominated. The properties
of cold dark matter allow tiny pre-existing fluctuations (the tinkling
triangles of astrophysics, if you will) to grow on all scales without
hindrance.
MESH
REFINEMENT
While the composition
of cold dark matter is unknown, its mass can be estimated and
its role in gravitational condensation can be calculated. The
salient feature of cold dark matter is the power spectrum of its
density fluctuations (there are more flutes than cannons), suggesting
that cosmic structure is formed "bottom-up," by the
gravitational amplification of initially small fluctuations.
So the computational problem becomes: how do you follow a reasonably
large sample of the infant universe as its hydrodynamic perturbations
lead to a collapsing protogalactic object (millions of solar masses)
and, within that, to protostellar (hundreds of solar masses) clouds?
More specifically, how do cosmologists follow all that with the
correct chemistry and thermodynamics in a properly expanding cosmological
space-time continuum?
The gravitational problem alone has been attacked by N-body simulation,
but Enzo is much more ambitious, just as a great Romantic symphony
is more complex than a baroque country dance. Enzos time-dependent
calculation is carried out over the full three dimensions on a
structured, adaptive grid hierarchy that follows the collapsing
protogalaxy and subsequent protostellar cloud to near-stellar
density, starting from primordial fluctuations a few million years
after the Big Bang. Enzo thus combines a number of computational
modules. There is a hydrodynamic solver for the primordial gas,
an N-body (particle) solver for the collisionless cold dark matter,
a special solver specifically for the gravitational field, and
a 12-species chemical reaction solver for the primordial gas chemistry.
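As a sketch of how these modules might cooperate, the toy below (invented names and toy physics, not Enzo's actual API) applies the four solvers in an operator-split sequence each timestep, with gravity coupling the gas and the dark matter:

```python
# A toy operator-split timestep for the four modules named above.
# All names and "physics" are illustrative, not Enzo's actual code.

def step(state, dt):
    """Advance a toy 'universe' one timestep via operator splitting."""
    # 1. Gravity: a stand-in potential from total (gas + dark matter) density.
    state["phi"] = -(state["gas_rho"] + state["dm_rho"])
    # 2. Hydro: gas condenses toward the potential minimum (toy update).
    state["gas_rho"] *= 1.0 + 0.1 * dt * abs(state["phi"])
    # 3. N-body: collisionless dark matter responds to the same potential.
    state["dm_rho"] *= 1.0 + 0.05 * dt * abs(state["phi"])
    # 4. Chemistry: a toy rate, e.g. molecular-hydrogen fraction
    #    growing with gas density.
    state["f_H2"] += dt * 1e-3 * state["gas_rho"]
    return state

state = {"gas_rho": 0.1, "dm_rho": 0.9, "phi": 0.0, "f_H2": 0.0}
for _ in range(10):
    step(state, dt=0.1)
```

The ordering matters in a real code; this sketch only shows the coupling pattern, not a stable integration scheme.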
"We can't do all this on a uniform mesh," said
Norman, "because there is no computer large enough to contain
all the spatiotemporal scales we must follow." Instead, the
researchers use structured adaptive mesh refinement, a way of
adding octaves to the cosmic range. While solving the equations
on a uniform grid, the code follows the quality of the solution
and, when necessary, adds an additional fine mesh over any region
that requires enhanced resolution. The finer mesh obtains its
boundary conditions from the coarse mesh. The finer grid is also
used to improve the solution on its parent. As the evolution continues,
the finer mesh may need to be moved, resized, or removed. Even
finer meshes may be required, producing a tree structure that
may continue to any depth.
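The coarse/fine exchange just described can be illustrated in one dimension. This is a minimal sketch, not Enzo's actual three-dimensional scheme: prolongation fills a fine patch from its parent cells using a centered slope (so the fine values average back to the parent value), and restriction feeds the improved fine solution back onto the parent:

```python
# 1D sketch of the coarse/fine grid exchange in structured AMR.
# Illustrative only; a production code works in 3D with conservative fixes.

def prolong(coarse, i):
    """Split coarse cell i into two fine cells using a centered slope.
    The two fine values average back exactly to coarse[i]."""
    slope = (coarse[i + 1] - coarse[i - 1]) / 2.0
    return [coarse[i] - slope / 4.0, coarse[i] + slope / 4.0]

def restrict(fine):
    """Average pairs of fine cells back onto their parent coarse cells."""
    return [(fine[2 * j] + fine[2 * j + 1]) / 2.0
            for j in range(len(fine) // 2)]

coarse = [1.0, 2.0, 4.0, 8.0]
fine = prolong(coarse, 1) + prolong(coarse, 2)  # fine patch over cells 1-2
parent_view = restrict(fine)                    # recovers [2.0, 4.0]
```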
Enzo spawns a new mesh when any cell accumulates enough mass that
refinement is needed to preserve a given mass resolution in the
solution, or when a minimum length criterion to resolve perturbations
is exceeded (Figure 2).
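The two triggers can be sketched as a simple per-cell predicate; the threshold names and values below are illustrative, not Enzo's actual parameters:

```python
# Sketch of the two refinement triggers described above: too much mass
# in one cell, or a local length scale no longer resolved by the cell
# width. Names and thresholds are illustrative only.

def needs_refinement(cell_mass, length_scale, dx,
                     mass_limit=1.0, cells_per_length=4):
    """Return True if this cell should be covered by a finer mesh."""
    overdense = cell_mass > mass_limit                  # mass criterion
    unresolved = length_scale < cells_per_length * dx   # length criterion
    return overdense or unresolved

flags = [needs_refinement(m, L, dx=0.1)
         for m, L in [(0.5, 1.0),    # neither criterion: leave coarse
                      (2.0, 1.0),    # too massive: refine
                      (0.5, 0.2)]]   # length unresolved: refine
```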
Figure 2: Another Fine Mesh (© 2001 Tom Abel; used by permission)
Norman pointed out
that a fluctuation containing the mass of the Milky Way galaxy
will collapse by a factor of about 1,000 before it comes into
dynamical equilibrium. A code to follow such a collapse would
need to have a spatial dynamic range of 10^5. Resolving the formation
of individual stars (even very large ones) within a galaxy-full
of gas would require even more resolution, a spatial dynamic range
on the order of 10^20. The work just completed is at a spatial
dynamic range of 10^12, roughly the ratio of the diameter of
the Earth to that of a human cell.
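These dynamic ranges translate directly into AMR depth. Assuming a root grid of 256 cells per dimension (an illustrative figure, not taken from the paper) and factor-of-2 refinement, the number of levels needed is:

```python
import math

def levels_needed(dynamic_range, root_cells=256, refine_factor=2):
    """Refinement levels required so the finest cells cover the given
    spatial dynamic range atop a root grid of root_cells per dimension.
    root_cells=256 is an illustrative assumption."""
    return math.ceil(math.log(dynamic_range / root_cells, refine_factor))

lev_collapse = levels_needed(1e5)   # protogalactic collapse:   9 levels
lev_done = levels_needed(1e12)      # the completed calculation: 32 levels
lev_star = levels_needed(1e20)      # individual stars:          59 levels
```

This is why the tree of subgrids "may continue to any depth": tens of levels of nested meshes stand in for a single uniform grid no computer could hold.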
CODE
DETAILS
Enzo is implemented
in C++, an object-oriented high-level language, with some compute-intensive
kernels in Fortran 77. The object-oriented approach provides two
benefits: encapsulation (a single mesh is the basic object,
or building block) and extensibility (new physics may be added
easily at all levels).
"The hard part in running Enzo on a variety of platforms
is parallelization and load balancing," said Robert Harkness,
a computational astrophysicist at SDSC who is working with Norman
to prepare the Enzo code for runs across the TeraGrid. This is
the orchestral equivalent of deciding how many violins, violas,
cellos, and bass viols will make up the string section. "We
are using a version running under the Message-Passing Interface,
which allows us to exploit the object-oriented design by distributing
objects over the processors, rather than attempting to distribute
only the smallest grids themselves," said Harkness. "The
small subgrids are generally numerous, and many may be quite short-lived,
enabling them to be created and removed on a single processor."
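Distributing whole grid objects over processors is a bin-packing problem. One common heuristic, shown here as an illustrative sketch rather than Enzo's actual scheduler, greedily assigns each grid to the currently least-loaded rank:

```python
import heapq

def distribute(grid_sizes, n_ranks):
    """Greedy load balancing: assign each grid (sized by cell count)
    to the currently least-loaded rank; return a rank per grid.
    Illustrative sketch, not Enzo's actual load balancer."""
    heap = [(0, rank) for rank in range(n_ranks)]   # (load, rank) pairs
    heapq.heapify(heap)
    assignment = []
    for size in grid_sizes:
        load, rank = heapq.heappop(heap)            # least-loaded rank
        assignment.append(rank)
        heapq.heappush(heap, (load + size, rank))
    return assignment

# One large grid and many small, short-lived subgrids:
ranks = distribute([64, 8, 8, 32, 8, 16], n_ranks=2)
```

Because each short-lived subgrid lives entirely on one rank, it can be created and destroyed without any cross-processor coordination.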
Other innovations include the creation of "sterile" objects
that contain information about the sizes and locations of grids,
but without the solutions. Each processor can hold the entire
hierarchy of grids as sterile objects as it works the solution
on any grid or grids that are local to the processor, which reduces
communications traffic. Ultimately, however, the process of obtaining
boundary values for the root (largest) grid is nonlocal; the sterile
objects enable the code to pipeline the global communications
so that they are received first where the solution can be examined
first.
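The sterile-object idea can be sketched as metadata that every rank keeps, with solution arrays present only on the owner. Class and field names here are invented for illustration; they are not Enzo's:

```python
from dataclasses import dataclass

@dataclass
class GridMeta:
    """A grid descriptor every rank can hold; "sterile" when data is None.
    Illustrative names, not Enzo's actual classes."""
    level: int
    origin: tuple
    shape: tuple
    owner: int           # MPI rank that holds the real solution arrays
    data: object = None  # present only on the owning rank

    @property
    def sterile(self):
        return self.data is None

def localize(hierarchy, my_rank):
    """Keep solution data only for grids owned by my_rank; all other
    grids become sterile placeholders (sizes and locations only)."""
    return [GridMeta(g.level, g.origin, g.shape, g.owner,
                     g.data if g.owner == my_rank else None)
            for g in hierarchy]

full = [GridMeta(0, (0, 0), (64, 64), owner=0, data=[0.0] * 4),
        GridMeta(1, (16, 16), (32, 32), owner=1, data=[0.0] * 4)]
view = localize(full, my_rank=0)   # rank 0's picture of the hierarchy
```

Every rank can thus walk the entire tree, decide which boundary exchanges it will need, and schedule them, while the bulky arrays themselves never leave their owners except when required.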
Accurate description of the positions of grids and particles within
the problem domain requires the code to work at extended
precision ("double," where a single-precision word is 64 bits
long; "quadruple" on 32-bit words). "Fortunately," Harkness
said, "Blue Horizon supplies 128-bit arithmetic." The
researchers have developed methods of restricting the 128-bit
work to the parts of the code that require it, and only when
necessary.
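A small demonstration of why positions in particular need the extra width: at large dynamic range, a sub-cell offset added to a global coordinate in a single double is rounded away, whereas keeping the coordinate as a (base, offset) pair preserves it. The pairing below is an illustrative stand-in for true 128-bit arithmetic, not Enzo's actual representation:

```python
# A double has a 52-bit fraction, so near 10^15 its grid spacing (ulp)
# is 0.125: a sub-cell offset of 0.1 cannot survive a direct addition.

base = float(10**15)   # a large global coordinate, exactly representable
offset = 0.1           # a sub-cell displacement

naive = base + offset  # one double: the offset is rounded to 0.125
pair = (base, offset)  # two words: the offset survives exactly

lost = (naive - base) != offset   # True: precision was lost in 'naive'
```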
FACING
THE MUSIC
While Enzo in its present
form is challenging enough for current platforms, the possibility
of distributing it across the TeraGrid beckons, and with it
the necessity to increase the dynamic range of the
code. At present, it begins with what might be called a statistically
significant fraction of the early universe (a large volume
of isotropic cold dark matter and gas) and proceeds to follow
the physics down to protogalactic and protostellar objects.
A key test of the model physics will come when the dynamic range
is extended to the level of the first star itself, as its gravitational
collapse overcomes its radiative expansion, enabling the fusion
of hydrogen into helium with the release of energy in its core.
It is the progress of this reaction that leads to the fusion of
helium into nuclei of heavier elements that eventually influence
the formation of next-generation stars.
"It seems that it will be possible, using this method, to
extend the dynamic range on both ends of the spectrum," Norman
said. "In orchestral terms, the thunder will be more thunderous
and the whispers will be all but inaudible, and the supercomputer
platforms will be, collectively, the universe's largest,
noisiest concert hall." MM
PROJECT LEADER
Michael L. Norman
UCSD
PARTICIPANTS
Tom Abel
Harvard-Smithsonian Center for Astrophysics,
Cambridge University
Greg Bryan
MIT, Oxford University