FLUID DYNAMICS | Contents | Next

Space Plasma Group Simulates Solar Storms with NPACI Resources


Soon after rockets and satellites began orbiting the Earth, scientists realized the importance of studying the region between the Sun and the Earth for interpreting data from satellites and predicting solar storms. While the Earth's magnetic field creates a protective shield, called the magnetosphere, plasma particles from the Sun (electrons, protons, and the nuclei of helium and heavier atoms) can penetrate the magnetosphere. Magnetospheric disturbances can cause vivid auroras, radio interference, power blackouts, navigation problems for ships and airplanes, and spacecraft damage. Researchers in the Space Plasma Simulations Group at UCLA have advanced "space weather" forecasting by modeling these phenomena using NPACI resources.

Figure 1. Reproducing Past Events

Three-dimensional rendering of the magnetosphere in a coordinate system with X pointing sunward, Y duskward, and Z northward. The Earth is the small light blue sphere at the center of the coordinate axes. The equatorial and meridian planes are color-coded according to the plasma density, with regions of high density shown in red. The yellow isosurface on the dawn side is a contour of constant density and coincides with the bow shock. The MHD simulation was run using real plasma and field parameters measured by the ISEE-3 spacecraft upstream of the Earth on August 27, 1978. At that time, an interplanetary shock front created surface waves on the Earth's bow shock and compressed the entire magnetosphere.

Beginning in 1980 with a grant from NASA, space physicist Maha Ashour-Abdalla and colleagues at UCLA have conducted research into the physics of energy and plasma transport through the Earth's magnetosphere. "We have used supercomputers at SDSC for our modeling work since 1986," she said. Since July 2000, the group has been working with NPACI's Strategic Applications Collaborations (SAC) staff to improve their simulations. By comparing NASA spacecraft observations with supercomputer simulations, the group studies how the magnetosphere responds to changes in the solar wind.



The solar wind consists of charged particles that stream out of the Sun at high velocities, pushing against the Earth's magnetic field and stretching the magnetosphere into a long tail on the night side, opposite the Sun. The interaction of the solar wind and the magnetosphere leads to reconnection, in which magnetic field lines intersect and merge. When the solar wind transfers energy to the magnetotail, reconnection induces an ejection of plasma back towards the Earth, depositing energy in the upper atmosphere and creating auroral light. The details of such geomagnetic "substorms" depend on many factors, including the solar wind strength and the direction of the overall magnetic field from the Sun.

"The interactions of solar storms create complex magnetospheric configurations," said physicist Jean Berchem of the UCLA group. "We try to model these events to better understand how the energy from the solar wind is transferred to the Earth's environment. The implications of this transfer of energy can be very dramatic, especially during solar maximum."

In December 2000, the Earth was in the midst of a solar maximum, a peak in the 11-year cycle over which solar activity waxes and wanes. Occasionally, a coronal mass ejection (CME) of high-energy particles from the Sun hits the magnetosphere. Possible consequences of these solar flare-ups include particularly intense precipitation of energetic particles in the auroral regions, creating vivid displays of light and color at high latitudes, and overload conditions for power grids on the ground. High-energy particles can also damage satellites and may even be hazardous to astronauts.

"CMEs are a very active research topic," Berchem said. "During solar maximum, large CMEs occur very often, about one or two a month. Coronal ejections have very different sizes and dynamics. Sometimes, they have very little effect on the Earth's magnetosphere, while at other times they can dramatically alter the Earth's space environment, leading to strong geomagnetic storms."

Solar disturbances have caused major problems at the Earth over the last few decades. One of the strongest geomagnetic storms occurred during the solar maximum in March 1989, when several spacecraft had to have their orbits adjusted and a power blackout occurred over eastern Canada. In January 1997, AT&T lost its Telstar 401 satellite following a geomagnetic disturbance, and in 1998 the failure of the Galaxy IV satellite knocked out pagers throughout the United States. Last year, a space shuttle was sent up to boost the International Space Station to a new orbit because of the increased solar activity. "Magnetospheric research has enabled us to recognize precursors to geomagnetic storms," said David Schriver of the UCLA group. "We have about two days' notice when we see a CME or its precursors, before the storm might possibly reach the region of the Earth."



Figure 2. Simulating an Aurora

These diagrams show Earth's polar regions as seen from above, in satellite data and in simulations. The solar wind comes in at 12 noon (the top of each diagram), and each concentric circle outward from the center marks 10 degrees lower latitude, starting from 90 degrees at the center. Satellite data (top) from the Visible Imaging System (VIS) camera on the Polar spacecraft, looking down on the north pole of the Earth from space (courtesy of Louis Frank of the University of Iowa), and an equivalent view from a global MHD simulation (bottom) show the location of an aurora (red). In both panels, the aurora appears near midnight between about 60 and 70 degrees latitude. The MHD simulation used upstream solar wind conditions on December 22, 1996, as input.

The Space Plasma Simulations Group uses NPACI supercomputers at SDSC to simulate the effects of changes in solar wind conditions on the Earth's environment. "During a solar maximum, many storms occur, changing the entire plasma system surrounding the Earth," Berchem said. "Our research simulates these events using the computer."

"Simulation and modeling of plasmas near the Earth has really benefited from the use of supercomputers," said group leader Ashour-Abdalla. "The continual improvement of the resources at SDSC has certainly changed the fortunes of this group. Our ability to treat multicomponent plasmas realistically has been greatly extended."

The group tries to understand space weather by modeling storms using both local and global simulations. Global magnetohydrodynamic (MHD) simulations display the big picture, while local particle-in-cell (PIC) simulations of individual particles show more accurate, detailed interactions. Large-scale kinetics (LSK) is an intermediate approach in which particles are followed through the global fluid fields to determine how solar wind particles enter the magnetosphere, and in particular how they penetrate its near-Earth region. The results of these simulations are then compared to actual observations.
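The kinetic side of this hierarchy amounts to pushing charged particles through electromagnetic fields. As an illustrative sketch only (not the UCLA group's code), the following follows a single proton through a uniform, magnetosphere-like magnetic field using the Boris algorithm, the standard particle mover in PIC and test-particle codes; the field strength, time step, and initial speed here are arbitrary choices for the example.

```python
import math

def cross(a, b):
    # 3-D cross product of two (x, y, z) tuples
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def axpy(c, a, b):
    # componentwise b + c*a
    return tuple(bi + c*ai for ai, bi in zip(a, b))

def boris_push(x, v, E, B, qm, dt):
    """One Boris step: half electric kick, magnetic rotation, half kick."""
    v_minus = axpy(0.5*qm*dt, E, v)
    t = tuple(0.5*qm*dt*bi for bi in B)          # rotation vector
    t2 = sum(c*c for c in t)
    s = tuple(2.0*c/(1.0 + t2) for c in t)
    v_prime = axpy(1.0, cross(v_minus, t), v_minus)
    v_plus = axpy(1.0, cross(v_prime, s), v_minus)
    v_new = axpy(0.5*qm*dt, E, v_plus)           # second half electric kick
    return axpy(dt, v_new, x), v_new

QM = 9.58e7                        # proton charge-to-mass ratio, C/kg
B = (0.0, 0.0, 5.0e-8)             # ~50 nT, a typical near-Earth field strength
E = (0.0, 0.0, 0.0)                # no electric field in this toy case
x, v = (0.0, 0.0, 0.0), (1.0e5, 0.0, 0.0)       # 100 km/s proton

for _ in range(1000):
    x, v = boris_push(x, v, E, B, QM, 1.0e-4)

speed = math.sqrt(sum(c*c for c in v))
print(speed)   # with E = 0, the Boris rotation conserves speed
```

In an LSK-style calculation, the uniform E and B above would instead be interpolated at each particle position from the stored output of a global MHD run.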

"To model large-scale interactions we use data from spacecraft monitoring the solar wind far upstream from Earth as input to our global MHD simulations," Berchem said (Figures 1-2). "Then, we assess the validity of our predictions by comparing the results of the simulations with spacecraft observations from the NASA International Solar Terrestrial Physics (ISTP) program, as well as ground-based data obtained by magnetometer chains and radar supported by the National Science Foundation. We have been very successful in comparing simulated auroral patterns with auroral observations using images from the Visible Imaging System on board the NASA Polar spacecraft."
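Validation of this kind boils down to comparing a simulated quantity against the measured one at a spacecraft or ground station. A minimal sketch, with entirely made-up numbers standing in for real magnetometer data, using a Pearson correlation as the agreement score:

```python
import math

def pearson(sim, obs):
    # correlation coefficient between two equal-length time series
    n = len(sim)
    ms, mo = sum(sim) / n, sum(obs) / n
    num = sum((s - ms) * (o - mo) for s, o in zip(sim, obs))
    den = math.sqrt(sum((s - ms)**2 for s in sim) *
                    sum((o - mo)**2 for o in obs))
    return num / den

# Hypothetical values (nT): a simulated magnetic field component vs. a
# ground magnetometer measurement over five time steps.
simulated = [10.0, 12.5, 30.0, 22.0, 15.0]
observed  = [11.0, 13.0, 28.5, 21.0, 16.0]
r = pearson(simulated, observed)
print(round(r, 3))   # a value near 1.0 means the model tracks the data
```

Real comparisons of auroral images are two-dimensional and more involved, but the principle of scoring simulation output against observation is the same.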



The UCLA group originally ran its codes on the Cray T3E, but soon found that the IBM SP Blue Horizon ran the kernels two to three times faster. "We are trying to improve our global MHD simulations at each step," said group researcher Vahe Peroomian. "Ultimately, when our simulation code is able to run in real time, if an upstream spacecraft sees energetic particles coming toward Earth, we might be able to predict which satellites could be in danger. Because of our computer simulations, we can model the consequences of a storm."

The UCLA group specializes in the use of multiple simulation methods to study space plasma physics. "In our research, we have the global MHD simulations, and we often use the MHD results to drive LSK particle runs," Peroomian said. "We also have the local PIC calculations. This combination allows us to study local effects and apply our findings directly to space weather."

"NPACI's Strategic Applications Collaborations (SAC) program helps these researchers to do their science better and faster," said Bob Sinkovits, SAC coordinator and SDSC computational scientist. "As parallel machines become more complicated, it is harder to get good performance without an insider's knowledge of machine particulars. But our researchers should be focused on science instead of computational issues, which is where our staff comes in." The computational scientists in the SAC project come from science and engineering backgrounds, creating a bridge between the scientific and computer worlds.

"The UCLA group's MHD code initially would not run with the MPI library on Blue Horizon," Sinkovits said. "So, Dong Ju Choi and Dominic Holland modified the code so that it could use MPI on Blue Horizon."

Choi has also succeeded in porting the particle-in-cell simulation code to the Sun Enterprise 10000 at SDSC. "Choi's work has allowed us to speed up the code by a factor of 10," Schriver said. "The collaboration is great because we are interested primarily in the science. We know how the codes work, but in terms of optimization and how to best use available computer resources, we can always use help."

Peroomian has also seen large gains since moving his LSK code from an IBM SP at UCLA with 10 to 20 nodes to Blue Horizon, which can run the code on 30 to 80 nodes. "A 30-second job originally took 12 hours on the older machines," he said. "Now, on Blue Horizon, even before Sinkovits' speed-up, the same job would take four minutes. His work sped up the code by a further factor of three to four. It took me six months to complete the first run of this code. Now a good run takes less than eight hours."

The group's short-term goal is to reproduce observations by spacecraft and explain magnetospheric phenomena. "Modeling past events lets us assess the accuracy of our models. These comparisons help us to determine the physical processes our simulations need to accurately model the interaction of the solar wind with the Earth environment," Berchem said. "Ultimately, our goal is to refine these models so they are accurate enough to forecast weather in space."

The UCLA group ran 177 jobs on Blue Horizon between May and November 2000. They run jobs on 256 processors that consume more than 100 hours of machine time per year.

"Computer time is precious, and it can be difficult to find," Schriver said. "Originally, our codes ran on vector machines. Computational science has advanced to the point where we can't go back to such machines. We would like to see the SAC program continue. There is no other way to get the kind of results we are getting now."



Project Leader
Maha Ashour-Abdalla

Jean Berchem,
Mostafa El-Alaoui,
Vahe Peroomian,
Robert Richard,
David Schriver

SAC Team
Bob Sinkovits,
Dong-Ju Choi,
Dominic Holland