SDSC Thread Graphic Issue 2, November 2005

User Services Director:
Anke Kamrath

Subhashini Sivagnanam

Graphics Designer:
Diana Diehl

Application Designer:
Fariba Fana

Scientific Computing Corner:

SDSC Strategic Applications Collaborations Program Helps SCEC Conduct Terascale Earthquake Simulations

—Yifeng Cui

The large-scale TeraShake simulation by scientists working with the Southern California Earthquake Center (SCEC) stretched SDSC resources across the board, presenting unique computational and data challenges. SDSC computational experts in the Scientific Computing Applications Group worked closely with SDSU professor Kim Olsen, developer of the Anelastic Wave Model (AWM), and others to port the code to DataStar, SDSC's IBM Power4 platform. The code was enhanced to scale to thousands of processors for the very large mesh, which required a large amount of memory. The integrated AWM presented parallel computing challenges at this scale, including MPI and MPI-I/O performance improvements, single-processor tuning, and optimization.

Photo: Earthquake simulation mesh

Special techniques were introduced that reduced the code's memory requirements, making possible the largest and most detailed earthquake simulation to date of the southern San Andreas Fault in California. The Strategic Applications Collaborations (SAC) staff devoted significant effort to ensuring successful implementation of the code integration for the TeraShake simulation, as well as preparing and performing the final production run, which used 240 processors for four days and produced 43 TB of data, stored on DataStar's GPFS parallel file system. The final run, made possible by the enhancements to the code, used a mesh of 3,000 x 1,500 x 400 at 200 m resolution, some 1.8 billion points, ten times the size of previous runs.
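As a quick sanity check on the figures above, the mesh dimensions multiply out to the 1.8 billion grid points cited; the bytes-per-point figure below is an illustrative assumption, not something stated in the article:

```python
# Back-of-the-envelope check of the TeraShake 1 mesh size.
nx, ny, nz = 3000, 1500, 400      # mesh dimensions at 200 m resolution
points = nx * ny * nz             # total grid points
print(f"{points:,} grid points")  # 1,800,000,000 -- the 1.8 billion cited

# Illustrative memory estimate (assumed ~40 bytes of state per grid point,
# e.g. velocity, stress, and material-property fields).
bytes_per_point = 40
print(f"~{points * bytes_per_point / 1e9:.0f} GB of mesh state")  # ~72 GB
```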

SDSC's computational collaboration effort was supported through the NSF-funded SDSC SAC and Strategic Community Collaborations (SCC) programs. The goal of these collaborations is to develop synergy between the academic researchers and SDSC staff, which accelerates the researchers' efforts by helping them use SDSC resources most effectively. This enables new science such as TeraShake on relatively short timescales.

Researchers are selected for SAC collaborations from diverse academic disciplines, including both traditional HPC application fields and new communities, and TeraShake is a prime example of this kind of collaboration. The enhanced code offers increased scalability, performance, and portability, so the TeraShake SAC/SCC work has also provided lasting value: the optimized code is now available to the earthquake community for future large-scale simulations.

Photo: TeraShake earthquake simulation
TeraShake 1 earthquake simulations. The figure shows a snapshot from a TeraShake movie depicting the cumulative peak velocity magnitude for two southern San Andreas Fault rupture scenarios, rupturing NW-SE (left) and SE-NW (right). Source: Kim Olsen and Yifeng Cui, movie by Amit Chourasia.

In 2005 the SDSC SAC group continues its support of the SCEC community, with the collaboration leading to resource allocation grants to SCEC of one million CPU hours on NSF Terascale facilities. In collaboration with SDSU professor Kim Olsen and others, SAC staff have incorporated new dynamic rupture features into the TeraShake code and run the first-ever dynamic rupture simulation at TeraShake scale.

Photo: Dynamic rupture simulation

Previously, most earthquake wave propagation simulations specified the earthquake as a kinematic source (as in TeraShake 1), prescribing a velocity time series that might be generated by such an earthquake. In contrast, dynamic fault rupture simulations model the effects of friction, both static and kinetic, producing a more realistic source description. By combining dynamic rupture simulations with wave propagation simulations at TeraShake 1 scale, the researchers plan to perform the most realistic, physics-based earthquake simulations yet, known as TeraShake 2.

As part of the TeraGrid Advanced Support for TeraGrid Applications (ASTA) effort, SDSC SAC staff helped port the SCEC CyberShake code for seismic hazard analysis onto the TeraGrid IA-64 platform. They also designed a strategy for archiving the expected massive 140 TB of simulation output. The CyberShake team has recently completed simulations of hazard curves at the USC and Pasadena sites in the Los Angeles basin as a first implementation of this widely discussed calculation of probabilistic hazard curves for Southern California. What is unique is that the SCEC approach uses physics-based full-waveform modeling rather than traditional, more approximate attenuation relationships. Seismic hazard analysis is an important resource relied on by seismologists, engineering groups, and emergency management personnel to forecast the levels of shaking expected at various sites due to earthquakes. A hazard curve indicates the range of ground motions expected at a site over a period of time, along with the probability of each.
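As background on the hazard-curve concept described above, here is a minimal sketch of the standard Poisson exceedance calculation used in probabilistic seismic hazard analysis; the rate and time window are illustrative values, not figures from the article:

```python
import math

def exceedance_probability(annual_rate, years):
    """Probability that shaking above a given level occurs at least
    once in `years`, assuming earthquakes follow a Poisson process."""
    return 1.0 - math.exp(-annual_rate * years)

# Illustrative point on a hazard curve: a shaking level exceeded at an
# average rate of once per 100 years has roughly a 39% chance of being
# exceeded during a 50-year window.
print(round(exceedance_probability(0.01, 50), 3))  # 0.393
```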

CyberShake is using the TeraGrid as a true Grid environment, performing cross-site runs at SDSC, Argonne, NCSA, and elsewhere, and sharing data across the TeraGrid using the recently implemented global shared file system, GPFS-WAN. The outputs are registered into the SRB in the SCEC Digital Library. In the coming year, the project plans to expand to calculate hazard curves for 100 or more sites; for each site it will perform two wave propagation simulations at a scale of 400 km by 400 km and 350,000 rupture processing calculations. This will consume 45,000 allocation hours and generate 14 TB of output. The final goal of the project is a comprehensive seismic hazard map for the Southern California region, to be used as input for improved building codes and hazard estimate studies.
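The planned workload above can be tallied with simple arithmetic; the per-site figures below are derived averages, not numbers stated in the article:

```python
# Tally of the planned CyberShake expansion described above.
sites = 100
sims_per_site = 2                      # wave propagation simulations per site
ruptures_per_site = 350_000            # rupture processing calculations per site
total_hours = 45_000                   # allocation hours for the campaign
total_output_tb = 14                   # TB generated in total

print(sites * sims_per_site)           # 200 wave propagation simulations
print(total_hours / sites)             # 450.0 allocation hours per site (average)
print(total_output_tb * 1000 / sites)  # 140.0 GB of output per site (average)
```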

Photo: Data Graph
Hazard Curve Calculator. Source: Philip Maechling, Southern California Earthquake Center

Yifeng Cui is reachable via e-mail at

Did you know ...?

SDSC uses quotas on GPFS on DataStar to control the use of disk space.
When you have exceeded your quota, the default editor "vi" may destroy the file you are editing. Use the alternate editor "vim", in /usr/local/bin on DataStar, instead: vim warns if the file cannot be saved because the disk quota is exceeded, and it leaves the existing file unmodified. - Eva Hocks