SDSC Strategic Applications Collaboration Helps SCEC Seismologists Enable Large-Scale TeraShake Simulations
PIs: Tom Jordan (USC), Kim Olsen (SDSU), Jean Bernard Minster (SIO), Carl Kesselman (USC), Reagan Moore (SDSC)
SAC Staff: Yifeng Cui (SDSC), Amit Chourasia (SDSC)
The large-scale TeraShake simulation by scientists working with the Southern California Earthquake Center (SCEC) stretched SDSC resources across the board, presenting unique computational as well as data challenges.
SDSC computational experts in the Scientific Computing Applications Group worked closely with SDSU professor Kim Olsen, developer of the Anelastic Wave Model (AWM), and others to port the code to DataStar, SDSC's IBM Power4 platform. The code was enhanced to scale up to thousands of processors for the very large mesh size, which required a large amount of memory. Integrating the AWM at this scale raised parallel computing challenges, which were addressed through MPI and MPI I/O performance improvements, single-processor tuning, and optimization.
Figure 1. A snapshot from a TeraShake movie depicting the cumulative peak velocity magnitude for two southern San Andreas Fault rupture scenarios, rupturing NW-SE (left) and SE-NW (right). Source: Kim Olsen and Yifeng Cui, movie by Amit Chourasia
Special techniques were introduced that reduced the code's memory requirements, making possible the largest and most detailed earthquake simulation to date of the southern San Andreas Fault in California. The Strategic Applications Collaborations (SAC) staff devoted significant effort to ensuring successful implementation of the code integration for the TeraShake simulation, as well as to preparing and performing the final production run. The final run, made possible by these code enhancements, used 240 processors for four days and produced 43 TB of data, stored on DataStar's GPFS parallel file system. It used a mesh of 3,000 x 1,500 x 400 points at 200 m resolution, a total of approximately 1.8 billion points and ten times the size of previous runs.
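The scale of the run can be sketched with simple arithmetic. The mesh dimensions and processor count come from the figures above; the number of state variables per grid point and the use of 4-byte precision below are illustrative assumptions, not details from the production code:

```python
# Rough sizing of the TeraShake 1 production run (illustrative sketch).
# Mesh dimensions and processor count are from the article; the variable
# count per grid point and 4-byte (single) precision are assumptions.
nx, ny, nz = 3000, 1500, 400            # mesh at 200 m resolution
points = nx * ny * nz                    # total grid points
print(f"grid points: {points:,}")        # ~1.8 billion

procs = 240
points_per_proc = points // procs        # even domain decomposition
print(f"points per processor: {points_per_proc:,}")

vars_per_point = 15                      # assumed: velocities, stresses, media
bytes_per_var = 4                        # assumed single precision
mem_gb = points * vars_per_point * bytes_per_var / 2**30
print(f"approx. in-core state: {mem_gb:.0f} GB machine-wide")
```

Even this back-of-the-envelope estimate shows why per-processor memory reduction was a prerequisite for the run.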
Figure 2. In TeraShake-2, instantaneous movement in the fault-parallel x direction, 110 seconds after the start of the northwest-moving rupture on the San Andreas Fault near the Salton Sea. Note the continued shaking in the sediment-filled Los Angeles basin well after the initial earthquake waves have passed. Simulation: SCEC scientists Kim Olsen, Steven Day (SDSU), et al., and Yifeng Cui et al., SDSC/UCSD. Visualization: Amit Chourasia, SDSC/UCSD.
In 2005 the SDSC SAC group continues its support of the SCEC community, with the collaboration leading to resource allocation grants to SCEC of one million CPU hours on NSF Terascale facilities. In collaboration with SDSU professor Kim Olsen and others, SAC staff has incorporated new dynamic rupture features into the TeraShake code and run the dynamic rupture simulation at TeraShake scale. Previously, most earthquake wave propagation simulations specified the earthquake as a kinematic source (as in TeraShake 1), prescribing the velocity time series that such an earthquake might generate. In contrast, dynamic fault rupture simulates the effects of friction, including both static and kinetic friction, producing a more realistic source description. Combining this dynamic source with TeraShake 1-scale wave propagation, the resulting physics-based earthquake simulation, TeraShake 2, ran at 100 m resolution with a mesh of 2992 x 800 x 400 for 35 hours on 1024 TeraGrid IA-64 processors to prepare the dynamic sources, then at 200 m resolution for four days on 240 processors of the newly expanded 15.6-teraflops DataStar. The 10 terabytes of output data are archived in the SCEC Digital Library, managed by the Storage Resource Broker (SRB) at SDSC, where they are readily available to researchers for further analysis.
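The kinematic/dynamic distinction above can be illustrated with a standard linear slip-weakening friction law, in which the friction coefficient drops from its static to its kinetic value as fault slip accumulates. This is a generic textbook sketch of one dynamic-rupture ingredient, not the specific friction model used in TeraShake 2, and the parameter values are arbitrary:

```python
# Linear slip-weakening friction: a minimal sketch of how a dynamic
# rupture source incorporates static and kinetic friction. Parameter
# values are illustrative, not taken from TeraShake 2.
def slip_weakening_mu(slip, mu_s=0.7, mu_d=0.5, d_c=0.4):
    """Friction coefficient as a function of accumulated slip (m).

    mu_s: static friction coefficient (before sliding begins)
    mu_d: kinetic (dynamic) friction coefficient (once slip >= d_c)
    d_c:  critical slip-weakening distance (m)
    """
    if slip >= d_c:
        return mu_d
    return mu_s - (mu_s - mu_d) * slip / d_c

# Friction weakens linearly with slip, then stays at the kinetic value:
for s in (0.0, 0.2, 0.4, 1.0):
    print(f"slip = {s:.1f} m -> mu = {slip_weakening_mu(s):.2f}")
```

In a kinematic source the slip history is imposed; in a dynamic source a law like this, together with the evolving stress field, determines where and how fast the rupture propagates.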
The TeraShake SAC/SCC work provides lasting value through enhanced code with increased scalability, performance, and portability. This optimized code is now available to the entire earthquake community for future large-scale simulations.