
News Nuggets

News from the San Diego Supercomputer Center

January 2008


Friends and Colleagues,

As we enter 2008, I want to reflect briefly on some highlights that made 2007 so eventful and why SDSC is such a special place to work. First, I'm proud of the innovative technologies launched during the year by our remarkable staff, including but not limited to: the release of the Open Source iRODS Data Management System by Reagan Moore and his colleagues, which represents a new approach to distributed data management and integration; the launch of user-settable supercomputer reservations, which gives users control over when their programs run; and the introduction of an on-demand system that gives researchers immediate access to supercomputing time for event-driven computing in emergencies such as earthquakes, major storms, or other disasters.

Beyond our strength as a technology driver, SDSC also used its expertise to help others gain better access to the Information Age. SDSC participated in several laudable programs, including a collaboration with the UCSD Communication Department to help kids and young adults in a federally subsidized housing project in Southeast San Diego learn the basics of computers and digital technology; in addition, the award-winning TeacherTech program brought new technology tools and technology-enabled science concepts to more than 1,400 teachers in the greater San Diego region.

The San Diego wildfires tested the mettle of all in the region, and I'm gratified to note SDSC's role in helping emergency responders and residents deal with this disaster. The SDSC-hosted Red Cross Safe and Well website -- which grew out of an intense collaboration that SDSC's Chaitan Baru and his colleagues forged with the Red Cross during Hurricane Katrina -- helped people reconnect during the fires. And Hans-Werner Braun's remote mountaintop cameras, part of his HPWREN project, helped firefighters and the public monitor the paths of the various wildfires throughout the county.

SDSC's staff often works as a team with colleagues at UC San Diego and across other institutions. In this spirit of collaboration, in November SDSC shared its booth at the annual Supercomputing '07 meeting with our UC San Diego colleagues from Calit2 and Scripps Institution of Oceanography, as well as the Electronic Visualization Laboratory in Chicago. The team provided an outstanding mix of compelling demonstrations and talks on what we all have in common: "Building the future with cyberinfrastructure."

We look forward to a momentous 2008. As always, we will continue to innovate, collaborate, mentor, and provide service to our community of researchers and friends, and we wish everyone a productive and happy new year.

Sincerely,
Fran Berman


 Building a Global Carbon Accounting System
Sequestering carbon from the atmosphere in growing forests will play a key role in society's efforts to deal with global warming. But many challenges lie ahead - technical, political, and economic - in building an operational system for carbon monitoring and accounting. The Workshop on Global Forest Carbon Measurement, Monitoring, and Accounting, held November 28 and 29 at the National Geographic Society in Washington, D.C., brought together more than 60 world experts to begin the process of understanding current carbon measurement systems and identifying viable paths to building a full system that can gain global acceptance. SDSC's Amit Majumdar, who leads the Scientific Computing Applications Group, attended the workshop to provide guidance on supercomputer modeling and data cyberinfrastructure for the project. Organized by the Global Forest Carbon Measurement Working Group, the workshop was sponsored by the National Geographic Society, the Clinton Climate Initiative-Carbon and Poverty Reduction Program, Blackstone Ranch Institute, and World Vision. The project will hold additional international workshops and prepare a report to guide future efforts.
 
 
 SDSC and Partners Successfully Process One Billion Files
SDSC, in partnership with IBM, DataDirect Networks Inc., and Brocade, has successfully processed one billion files at speeds never before seen in the industry. Formally announced at SC07, the demonstration was completed using a single instance of IBM's General Parallel File System (GPFS) and DataDirect Networks' S2A9550 Storage System. In preparation for the demonstration, IBM assembled a GPFS cluster at SDSC using 17 eight-way cluster members. The Billion File demonstration showed that the GPFS cluster could efficiently process metadata -- the vital descriptive information about each file -- enabling more than a billion files to be scanned. The test also showed that candidates for migration could be identified and moved to SDSC's HPSS (High Performance Storage System) tape archive multiple times daily. SDSC has long been a leader in data-intensive computing and storage, and the center's experience with both GPFS and HPSS made it an ideal location for this groundbreaking achievement.
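The key step -- identifying migration candidates from metadata alone, without ever reading file contents -- can be pictured with a minimal sketch. This is a conceptual illustration in Python, not the GPFS policy engine or its rule language; the scan root and age threshold below are hypothetical:

    import os
    import time

    # Hypothetical values for illustration -- real thresholds and paths are site policy.
    SCAN_ROOT = "/gpfs/scratch"
    MIGRATION_AGE_DAYS = 30

    def find_migration_candidates(root, age_days):
        """Walk the tree reading only file metadata (stat), yielding files whose
        last access is older than the threshold -- a toy, serial version of the
        kind of scan that GPFS's policy engine runs in parallel at scale."""
        cutoff = time.time() - age_days * 86400
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    st = os.stat(path)  # metadata only; file contents are never read
                except OSError:
                    continue            # file vanished mid-scan; skip it
                if st.st_atime < cutoff:
                    yield path, st.st_size

    if __name__ == "__main__":
        count = total_bytes = 0
        for path, size in find_migration_candidates(SCAN_ROOT, MIGRATION_AGE_DAYS):
            count += 1
            total_bytes += size
        print(count, "candidate files,", total_bytes, "bytes eligible for tape migration")

In the actual demonstration the scan ran in parallel across the cluster rather than as a serial directory walk, which is what makes a billion-file pass feasible several times a day.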
 
 
 Weather Simulation Records Set
A team of researchers from the National Center for Atmospheric Research, SDSC, Lawrence Livermore National Laboratory, and IBM Watson Research Center has set U.S. records for the size, performance, and fidelity of a computer weather simulation. The record simulations are an important step toward more accurate weather forecasts and climate research on the next generation of petascale supercomputers. The collaborating computer and weather scientists enabled the Weather Research and Forecasting (WRF) code to set a U.S. speed record for a weather model of 8.8 teraflops, using 12,090 processors on the Cray XT4 supercomputer at the DOE National Energy Research Scientific Computing Center (NERSC). The team also set a record for "parallelism" for a problem of this size on the IBM Blue Gene/L at Brookhaven National Laboratory, scaling the model to 15,360 processors. The simulations were 32 times larger and required 80 times more computational power than previously achieved with the code, capturing key features of the atmosphere never before represented in simulations covering such a large part of the Earth's atmosphere. The work was presented at SC07, where it was a finalist for the prestigious Gordon Bell Prize.
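As a back-of-the-envelope check, the record sustained rate averages out to well under a gigaflop per processor:

    8.8 teraflops / 12,090 processors ~ 0.73 gigaflops per processor (sustained)

At this scale the challenge is keeping thousands of processors usefully busy at once -- hence the separate "parallelism" record -- rather than the speed of any single processor.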
 
 
 Digital Eyes in the Sky Help Battle Wildfires in Southern California
Video and still images available in real time gave vital information to both fire crews and local residents in the San Diego area, informing them of the rapidly changing location and severity of threats to life and property as the catastrophic October wildfires raged out of control. Mounted on mountaintop network towers across San Diego County, remote cameras operated by the NSF-supported High Performance Wireless Research and Education Network (HPWREN) served as eyes in the sky, even as fires reached the towers on which they stood and made the situation too hazardous for human observers. The HPWREN network is led by SDSC researcher Hans-Werner Braun with partners at the Scripps Institution of Oceanography and San Diego State University. During the fires, the network supported access to camera images for more than 10,000 users. In addition to assisting state and local authorities during emergencies, HPWREN supports network analysis research, provides high-speed Internet access for field researchers in geophysics, astronomy, and ecology working in remote areas, and has supported educational opportunities for rural Native American learning centers and schools.
 
 
 ShakeOut: Unleashing a Massive Virtual Earthquake
Researchers from the Southern California Earthquake Center collaborated with SDSC computational scientists to achieve the largest and most realistic simulations ever of a magnitude 7.8 earthquake on the southern San Andreas Fault -- "the big one." These "virtual earthquakes" underpin improved seismic hazard estimates, better building codes in high-risk areas, and safer structural designs, potentially saving lives and property. Following extensive efforts to improve the model's efficiency on today's larger supercomputers, the researchers used NSF-supported systems at SDSC and the Texas Advanced Computing Center to capture ground motion frequencies as high as 1 hertz, twice as high as in previous simulations. Displaying the simulation results in SDSC-produced visualizations helped the scientists gain new insights into how such massive earthquakes may shake Southern California and affect tall buildings in the Los Angeles area, where peak predicted motion reached more than 3.5 meters per second. A close grid spacing of just 100 meters yielded 14.4 billion grid points for the 600 by 300 km region -- eight times more than in previous simulations -- capturing the higher frequencies and producing some 3 terabytes of data, which will be archived in the SCEC Digital Library at SDSC.
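For scale, the grid-point count follows directly from the spacing, assuming the simulated volume extends to a depth of about 80 km (a depth we infer from the reported total; it is not stated above):

    (600 km / 0.1 km) x (300 km / 0.1 km) x (80 km / 0.1 km)
      = 6,000 x 3,000 x 800
      = 14.4 billion grid points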
 
 
 Members Named to Blue Ribbon Task Force on a Sustainable Digital Future
An international group of experts has been named to address one of the greatest challenges of our time: how best to preserve and efficiently access vast amounts of vital digital data well into the future -- and do so in an economically sustainable manner. While the preservation of digital information has been widely discussed, the Blue Ribbon Task Force on Sustainable Digital Preservation and Access is unique in that its two-year charter focuses on the economic issues surrounding sustainable repositories and on identifying practical solutions. The Task Force will convene a broad set of international experts from the academic, public, and private sectors and publish an interim report at the end of 2008 and a final report in late 2009; the final report will include a set of actionable recommendations for digital preservation, grounded in a general economic framework. The first meeting of the task force will be held in late January in Washington, D.C.
 
 
