

News from the San Diego Supercomputer Center

March 2008

Friends and Colleagues,

This fall, SDSC will dedicate its new "green" 80,000 square-foot building addition, essentially doubling the size of the Center's facility at UC San Diego. Coinciding with the expansion of its physical space, SDSC also has been working on a new plan to evolve its mission for the next generation - to become a core resource for UCSD and the UC system, in addition to the national community. We already have taken some early steps in this process.

First, we have established a new post - Chief Scientific Officer - to guide the Center's research focus and expertise, and help SDSC integrate even more closely with our community of collaborators. We'll have an announcement about this appointment in the near future.

Next, with colleagues at UCSD and throughout the University of California system, we are creating campus and system plans and pilots for 21st century research cyberinfrastructure (CI) - a critical driver for the research and education mission of universities in the Information Age. Mike Norman and Phil Papadopoulos from SDSC are leading a broad campus group to develop UCSD research CI - infrastructure that will link and integrate major campus units and help UCSD prototype the "Research University of the Future."

We also have launched a new effort to facilitate partnerships with the private sector. Collaborations between universities and commercial companies in the technology sector are at an all-time high, with a raft of new industry "compute clouds" and data centers open to the academic community. Ron Hawkins, formerly of Sony, has been hired as SDSC's Director of Industry Relations to establish relationships with companies that wish to tap into SDSC's deep reservoir of CI tools and expertise. We'll share more about the programs Ron is developing in upcoming SDSC announcements.

Our goal, throughout this process, is to build a next-generation SDSC that not only continues to be a leader in CI expertise and infrastructure, but also pioneers the support of 21st Century research and education with 21st Century tools. Some great examples of this can be found in the Nuggets below.

Until next time.

Fran Berman

9-1-1 Call Analysis Offers Potential Early Warning Emergency System
Researchers from SDSC and Scripps Institution of Oceanography have conducted the first analysis of its kind on 9-1-1 emergency calls, which may be useful in enhancing early warning systems and coordinating first responses. Using nearly three years of 9-1-1 call data from the San Francisco Bay Area and more than 20 months of similar data from San Diego County, the researchers developed computer algorithms that set parameters to visually alert emergency personnel to abnormally high call rates in concentrated areas. The researchers found that these "hotspots," or clusters of activity, correlated directly with specific events in those areas, such as the San Diego wildfires last fall. While SDSC researchers so far have performed retrospective analyses of previously collected data, the "Spatiotemporal Analysis of 9-1-1 Call Stream Data" project is seen as a vital first step toward visual analyses available in real time and for larger geographic areas, aiding emergency providers in early detection of and response to fast-emerging problems during events such as earthquakes, explosions, or fires. For more information on the 9-1-1 Call Stream Data project, visit SDSC's News Center.
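The general idea of flagging abnormally high call rates in concentrated areas can be illustrated with a minimal sketch: bin call locations into grid cells for one time window and flag cells whose counts are statistical outliers. The grid size, the z-score rule, and all names below are illustrative assumptions, not the project's actual algorithms or parameters.

```python
# Hypothetical sketch of grid-based hotspot detection for a 9-1-1 call stream.
# The cell size and z-score threshold are illustrative assumptions only.
from collections import Counter
from statistics import mean, pstdev

def detect_hotspots(calls, cell_size=0.01, z_threshold=3.0):
    """Return the set of grid cells with abnormally high call counts.

    calls: iterable of (lat, lon) pairs observed in one time window.
    Each call is assigned to a cell of roughly cell_size degrees on a side;
    a cell is a hotspot if its count is more than z_threshold standard
    deviations above the mean count over occupied cells.
    """
    counts = Counter(
        (int(lat / cell_size), int(lon / cell_size)) for lat, lon in calls
    )
    if len(counts) < 2:
        return set()
    mu = mean(counts.values())
    sigma = pstdev(counts.values())
    if sigma == 0:
        return set()
    return {cell for cell, n in counts.items()
            if (n - mu) / sigma > z_threshold}

# Synthetic window: background calls spread over many cells,
# plus a burst of calls concentrated in one small area.
background = [(32.70 + 0.01 * i, -117.10 - 0.01 * i) for i in range(20)]
burst = [(32.755, -117.155)] * 30
hotspots = detect_hotspots(background + burst)
```

A real system would repeat this over sliding time windows and normalize against historical baselines per area, so that routinely busy areas are not flagged; the sketch shows only the single-window spatial step.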
Supercomputer-driven Tool Reveals Hidden Protein Interactions
In the atomic-level landscape of proteins, shape determines the all-important function of these molecules of life. When a protein molecule implicated in Parkinson's disease binds with the cell membrane, will a new drug candidate interrupt this interaction -- preventing disease progression and protecting the patient? It all depends on the precise geometry and energy of the protein structures. Researcher Igor Tsigelny and colleagues at SDSC and UCSD have developed the MAPAS (Membrane-Associated Protein Assessments) tool, which uses SDSC's Blue Gene supercomputer to study how proteins contact cell membranes. This 3-D "virtual molecular world" lets researchers zoom in on key details of the contact process, holding out the promise of new treatments for a wide range of devastating diseases, from Parkinson's and Alzheimer's to kidney disease and cancer. The researchers describe the MAPAS tool in the February 2008 issue of Nature Methods. Coauthors, all at UCSD, include Yuriy Sharikov, Ross Walker, Jerry Greenberg, Valentina Kouznetsova, Sanjay Nigam, Mark Miller, and Eliezer Masliah. Visit SDSC's News Center to learn more about the MAPAS tool.
SDSC Image of Internet Universe Displayed at NY Museum of Modern Art
An image depicting a frozen moment of activity in the Internet universe, created using visualization tools at SDSC, is part of a special exhibit running through May 2008 at the Museum of Modern Art in New York. Called Design and the Elastic Mind, the MoMA exhibit focuses on examples of successful translations of "disruptive innovation" and on the relationship between art, design, and science, particularly the approach to scale. The SDSC image depicts the round-trip times of data packets sent from a web site in Herndon, Virginia, to hundreds of thousands of nodes on the Internet and back again. It was created by Young Hyun and Bradley Huffaker, researchers with SDSC's Cooperative Association for Internet Data Analysis (CAIDA) program. CAIDA is an independent research group that for about 10 years has investigated both practical and theoretical aspects of the Internet. Visit SDSC's News Center for more information and a full view of the CAIDA image.
Preservation Project Helps Make Michigan Election Data Available
With election season in full swing, Michigan's Department of History, Arts and Libraries has announced that the state's Precinct Results Databases are now available online. The databases, used to certify and distribute official results of each election, are an accurate record of the political process. This valuable research tool was created by the Michigan Bureau of Elections and transferred to the Archives of Michigan for permanent preservation. To make the data searchable and publicly accessible, it was converted from its original format in a collaboration between the Archives of Michigan and researchers from the DICE group's Sustainable Archives and Library Technologies (SALT) lab at SDSC. The work was part of the Persistent Archives Testbed (PAT), a multi-state project funded by the National Historical Publications and Records Commission to investigate new and effective data grid methods of ensuring preservation of and access to electronic data long after the original technology has become obsolete. Michigan election data from 1992 through 2004 is available online and can be searched by year, county, office, city, and township. There are also GIS maps that highlight voting trends by showing counties color-coded according to party majority.
Simulations of Virtual 9.0 Megaquake in Pacific Northwest Created at SDSC
Megathrust earthquakes greater than magnitude 8.0 occur every 400 to 500 years in the Pacific Northwest, most recently in 1700. To help prepare for the next such earthquake, a team led by SDSU seismologist Kim Olsen and including SDSC and USGS researchers used a supercomputer-powered "virtual earthquake" to calculate, for the first time, realistic three-dimensional simulations that model the possible impacts of megathrust quakes in the region. The results, reported in the Journal of Seismology, were not reassuring, particularly for residents of downtown Seattle. Ground motion reached as high as 3 feet per second in Seattle, with significant velocities in Tacoma, Olympia, Vancouver, and Portland. "We also found that these high ground velocities were accompanied by significant low-frequency shaking, like what you feel in a roller coaster, lasting as long as five minutes - and that's a long time," said Olsen. By raising awareness of major earthquakes, these studies can aid preparations such as early warning systems, improved building codes, and other measures. Each of several simulations ran on 2,000 processors for some 80,000 processor-hours on SDSC's DataStar and Blue Gene systems, and also used the GPFS-WAN file system. The research was supported by the NSF, USGS, and SCEC. Visit SDSC's News Center to learn more about the megaquake simulation.
