Scientific Computing Corner:
The Strategic Applications Collaboration (SAC) and Strategic Community Collaboration (SCC) Program at SDSC - An Overview
The Strategic Applications Collaborations (SAC) and Strategic Community Collaborations (SCC) programs have been in place at SDSC since 1999. They allow SDSC's computational science experts to work closely with computational science researchers over an extended period, typically ranging from a few months to a year. The goal of the SAC and SCC programs is to enable users' computational science research in a significant way by drawing on SDSC's hardware, software, and human resources. The programs span high-performance computing, data, and visualization science projects that utilize SDSC resources. As part of the Cyberinfrastructure Partnership (CIP), SDSC staff also collaborate with NCSA staff on various projects.
SAC/SCC Selection Process
All SAC/SCC projects have the potential to significantly enable the user's scientific capability. In addition, to qualify for the SAC/SCC program, a project must meet some of the following criteria:
Examples of SAC and SCC projects
Over the past several years, SDSC computational experts have worked on SAC and SCC projects in disciplines such as astronomy, biochemistry, bioinformatics, chemical engineering, chemistry, civil engineering, climate science, geosciences, imaging science, linguistics, mechanical engineering, neuroscience, nuclear medicine, and space physics. In addition to these projects, SDSC's computational experts continuously gather single-processor, parallel-scaling, and I/O performance results for various micro-benchmarks, benchmark kernels, and applications on SDSC resources. These results provide guidelines to current and future users in choosing HPC resources for their applications.
Geoscience SCC: SDSC staff collaborated with researchers from the Southern California Earthquake Center (SCEC) to perform a large-scale simulation of seismic wave propagation from a magnitude 7.7 earthquake on the San Andreas fault. Detailed single-processor and parallel performance analysis, an MPI I/O implementation, and visualization were carried out as part of this effort. The simulation ran on 240 processors of the IBM DataStar machine for five days, consuming about 20,000 processor-hours and generating roughly 50 terabytes of output data, which was used for visualization.
Turbulence SAC: SDSC staff collaborated with Prof. PK Yeung of Georgia Tech toward the goal of performing Direct Numerical Simulation (DNS) of turbulence on a 4096³ grid. A parallel 3D FFT module with 2D domain decomposition was developed, and its scalability is now being tested. The resulting library of 3D FFT with 2D decomposition will be made available to general users. Effort is also underway to port this library to SDSC's BlueGene machine and test its scalability there.
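To illustrate why a 2D ("pencil") decomposition scales further than a 1D slab decomposition, the sketch below shows the bookkeeping for splitting a 3D grid across a 2D process grid: each rank owns full lines along one axis and a sub-block in the other two, so a 4096³ problem can use far more than 4096 processors. The function names and the 64 x 32 process grid are illustrative assumptions, not details of the SAC library itself.

```python
def pencil_extents(n, procs):
    """Split n grid points as evenly as possible among `procs` ranks;
    return a list of (start, stop) index ranges, one per rank."""
    base, extra = divmod(n, procs)
    extents, start = [], 0
    for r in range(procs):
        size = base + (1 if r < extra else 0)  # first `extra` ranks get one more
        extents.append((start, start + size))
        start += size
    return extents

def local_pencil(rank_row, rank_col, n, prow, pcol):
    """For a prow x pcol process grid, each rank owns the full x-line
    but only a (y, z) sub-block -- an 'x-pencil'."""
    ys = pencil_extents(n, prow)[rank_row]
    zs = pencil_extents(n, pcol)[rank_col]
    return (0, n), ys, zs

# Example: a 4096^3 grid on a hypothetical 64 x 32 process grid (2048 ranks).
x, y, z = local_pencil(0, 0, 4096, 64, 32)
print(x, y, z)  # (0, 4096) (0, 64) (0, 128)
```

Each rank performs 1D FFTs along its fully-owned axis, then the ranks transpose the data (e.g., with MPI all-to-all exchanges) so a different axis becomes fully local, repeating until all three directions are transformed.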
Bioinformatics SAC: SDSC staff collaborated with Prof. David Baker and his research group at the University of Washington. Rosetta is a prominent protein structure prediction code. SDSC staff parallelized it using POSIX-compliant file locking, and the resulting code scales to a large number of processors. This code is now running on both SDSC and NCSA resources, consuming millions of hours. Currently, effort is underway to improve the single-processor performance of this code.
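The mechanism behind this kind of file-locking parallelization can be sketched as follows: independent workers coordinate through a shared counter file, taking an exclusive POSIX advisory lock before claiming the next work item, so no two processes ever run the same job. This is a minimal illustration of the technique, not the actual modification made to Rosetta; the file path and function name are assumptions.

```python
import fcntl, os

def claim_next_job(counter_path, total_jobs):
    """Atomically claim the next unprocessed job index using a POSIX
    advisory file lock; return None when all jobs have been claimed.
    Illustrative sketch only -- not the actual Rosetta code."""
    with open(counter_path, "r+") as f:
        fcntl.lockf(f, fcntl.LOCK_EX)       # block until exclusive lock held
        try:
            current = int(f.read().strip() or 0)
            if current >= total_jobs:
                return None                  # nothing left to claim
            f.seek(0)
            f.write(str(current + 1))        # advance the shared counter
            f.truncate()
            return current
        finally:
            fcntl.lockf(f, fcntl.LOCK_UN)   # always release the lock

# Example: initialize the counter, then repeatedly claim jobs.
path = "/tmp/job_counter.txt"
with open(path, "w") as f:
    f.write("0")
claimed = [claim_next_job(path, 3) for _ in range(5)]
print(claimed)  # [0, 1, 2, None, None]
os.remove(path)
```

Because the lock serializes only the tiny claim step while the expensive structure-prediction work runs unsynchronized, this pattern scales to many processes without a central scheduler.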
Image source: http://depts.washington.edu/bakerpg
For more information about the SAC and SCC programs, contact Amit Majumdar via e-mail at email@example.com.