Grand Challenges in Data-Intensive Discovery

10/26/10  -  10/28/10

"Grand Challenges in Data-Intensive Discovery" conference was held October 26 - 28, 2010 at the San Diego Supercomputer Center (SDSC) on the campus of UC San Diego.

Science has entered a data-intensive era, driven by the deluge of data generated by digitally based instruments, sensor networks, and large-scale simulations. A growing part of the scientific enterprise is therefore devoted to analyzing these data, and such analysis places special demands on computer architectures: the associated calculations involve frequent I/O accesses, have large memory requirements, and often exhibit limited parallelism.

In mid-2011, SDSC will deploy Gordon, a unique data-intensive high-performance computing system. Gordon will be a peer-review-allocated resource on the National Science Foundation's TeraGrid, available to any US researcher. Its peak speed is expected to exceed 200 teraflops, and it will feature very large shared-memory nodes and a quarter petabyte of flash (SSD) memory to vastly accelerate large database and data-mining applications.

Presenter Materials

Needs and Opportunities in the Humanities

Peter Bajcsy
Research Scientist, Adjunct Professor
National Center for Supercomputing Applications (NCSA)
Institute for Computing in Humanities, Arts, and Social Science (I-CHASS)
University of Illinois at Urbana-Champaign (UIUC)

Enabling Data Sharing in Biomedical Research

Aziz Boxwala
Associate Professor
University of California, San Diego

Challenges in Analyzing a High-Resolution Climate Simulation

John M. Dennis, Matthew Woitaszek
National Center for Atmospheric Research (NCAR)