SDSC Helps with Turbulence Simulation
PI: P.K. Yeung, Georgia Institute of Technology
SAC Staff: Dmitry Pekurovsky (SDSC), Giri Chukkapalli (formerly SDSC)
A noted researcher in computational approaches to fluid turbulence, Professor P.K. Yeung of the School of Aerospace Engineering at the Georgia Institute of Technology is probing the details of turbulence through Direct Numerical Simulation (DNS) of the governing Navier-Stokes equations.
The study of how materials flow and mix is one of the most difficult physical problems in science and engineering. Understanding how materials mix can provide insight into a wide variety of problems, from the patterns cream makes as it is stirred into a cup of coffee to the way pollution from a factory smokestack disperses through the air. Prof. P.K. Yeung of Georgia Tech, Ph.D. student Diego Donzis, and SDSC Strategic Applications Collaboration staff Dmitry Pekurovsky and Giri Chukkapalli (now with Sun Microsystems) have been working on large-scale simulations of turbulent mixing that may one day utilize petascale resources. Yeung's Direct Numerical Simulation (DNS) code has been modified to improve its scalability. This work focused on the three-dimensional Fast Fourier Transform (FFT) module, the most time-consuming part of the code. Benchmarks on IBM's 114-teraflop Blue Gene Watson (BGW) system demonstrated that the code scales well, achieving 85 percent parallel efficiency for a 4096^3 run when scaling from 16,384 to 32,768 processors. Such benchmarking efforts are critical to preparing codes for petascale runs and are part of the intense application work needed to maximize the use of next-generation HPC resources.
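The three-dimensional FFT at the heart of the code is parallelizable because a 3-D transform factors into successive 1-D transforms along each axis; in a distributed-memory code, each stage of 1-D FFTs is local to a processor and global transposes redistribute the data between stages. The serial NumPy sketch below shows only that mathematical factorization, not Yeung's actual implementation; the array size and values are illustrative.

```python
import numpy as np

# A 3-D FFT factors into 1-D FFTs applied along each axis in turn.
# In a distributed decomposition, each stage is local to a processor
# and global transposes move the data between stages (not shown here).
n = 16  # illustrative size; production runs used 2048 or 4096 per side
rng = np.random.default_rng(0)
a = rng.standard_normal((n, n, n))

# Stage 1: 1-D FFTs along the z-axis (axis 2)
stage1 = np.fft.fft(a, axis=2)
# (a parallel code would transpose here so the next axis is local)
# Stage 2: 1-D FFTs along the y-axis (axis 1)
stage2 = np.fft.fft(stage1, axis=1)
# Stage 3: 1-D FFTs along the x-axis (axis 0)
stage3 = np.fft.fft(stage2, axis=0)

# The staged result matches the library's full 3-D transform
assert np.allclose(stage3, np.fft.fftn(a))
```

The transposes between stages are all-to-all communications, which is why the 3-D FFT dominates both the runtime and the communication cost as processor counts grow.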
Yeung is working with computational experts at SDSC to explore modifications to his code that can help it scale toward the elusive goal of 4,096 grid points per side, bringing additional realism to simulations of turbulent flows.
Turbulence is perhaps the most important unsolved problem in classical physics, and an improved understanding of it will have a wide range of applications in both basic science and applied technology, in fields from aeronautics, meteorology, and combustion to environmental problems. Turbulent flows are characterized by disorderly fluctuations across a wide range of interacting scales in time and space in an "energy cascade." Because of this, turbulence is highly effective at mixing heat or chemical species and dispersing contaminants. Despite decades of study, however, the complexity of these flows has resisted theoretical description, limiting researchers' ability to predict natural phenomena and design improved engineering technologies.
Figure 1. Turbulent fluctuations in this fluid jet promote efficient mixing. Large simulations by P.K. Yeung on SDSC's DataStar are giving new understanding of the complex mechanisms of mixing and contaminant dispersion, including for slow-diffusing substances that have been difficult to study. Image courtesy of K.R. Sreenivasan, University of Maryland.
Recent advances in computational power are opening significant opportunities to advance the understanding of turbulence. "Computing has matured into an important alternative and complement to experiments," said Yeung. "And the expertise, large computing resources, and fast turnaround at SDSC are helping us make significant progress." More realistic simulations are now generating large amounts of data at such high resolution that they sometimes surpass the detail that can be obtained in experiments. Consequently, there is growing interest in these simulation results, and Yeung is working to establish a resource that will make this valuable data available to the wider community.
One of the larger users of SDSC resources, Yeung is running on SDSC's DataStar supercomputer with an allocation of some 1.2 million CPU-hours. Yeung recently presented advances in his simulations made possible by using a cube domain with 2,048 grid points per side, double the previous 1,024, for a total of some 8 billion grid points. These simulations are the highest resolution of their kind to date in the U.S. As the problem size grew to 2,048 points per side, the researchers achieved 80 percent scaling efficiency, with the computations running on 1,024 processors of DataStar and using some 560 MB of memory. Yeung generated 20 data sets of the full velocity field at 160 GB each, for a total of more than 3 TB of data.
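The figures above can be checked with simple arithmetic. The grid size and output volume are taken from the text; the efficiency formula is the standard strong-scaling definition, and the timings in the sketch are hypothetical values chosen to be consistent with the 80 percent figure quoted, not measurements from the actual runs.

```python
# Grid-point count for a 2048^3 cube
n = 2048
points = n**3
assert points == 8_589_934_592  # "some 8 billion grid points"

# Standard strong-scaling efficiency when going from p1 to p2
# processors: eff = (t1 / t2) / (p2 / p1), where t is wall time.
def strong_scaling_efficiency(t1, p1, t2, p2):
    return (t1 / t2) / (p2 / p1)

# Hypothetical timings consistent with the quoted 80 percent:
# doubling the processors speeds the run up 1.6x instead of 2x.
eff = strong_scaling_efficiency(t1=100.0, p1=512, t2=62.5, p2=1024)
assert abs(eff - 0.80) < 1e-12

# Output volume: 20 velocity-field snapshots at 160 GB each
total_gb = 20 * 160
assert total_gb == 3200  # "more than 3 TB of data"
```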
One result of interest in environmental problems involves the spread of contaminants released in plumes. Yeung's simulations showed that particle separation statistics are larger than anticipated, implying that small amounts of contaminant can be found farther from a source than expected. Yeung noted that "these simulations demonstrate that you need to look at higher order statistics of the flow, not just the average or variance, to accurately predict how far contamination may spread." According to Yeung, the power of DataStar has enabled him to address challenges that arise in simulating the behavior of substances that diffuse slowly, such as in liquids, which have previously been very difficult to simulate.
As detailed as the present simulations are, there are still important questions about turbulence that can only be answered through even higher-resolution simulations. In the future, Yeung would like to double the resolution to 4,096 grid points per side, for a total of more than 65 billion grid points. Although simulations of this size have been run on Japan's Earth Simulator, such a run is not yet possible even on the largest computational resources available in the U.S. Working with SDSC computational scientists Giri Chukkapalli and Dmitry Pekurovsky in a Strategic Applications Collaboration (SAC), Yeung is now exploring modifications to his code that can help it scale toward the elusive goal of 4,096 grid points per side, bringing additional realism to simulations of turbulent flows.
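Doubling the linear resolution is far more expensive than it sounds, which is why the 4,096^3 goal remains out of reach on current U.S. resources. The grid-point counts below follow directly from the text; the eightfold growth in memory per field is simple arithmetic, and any further cost growth from the smaller time step needed at higher resolution is a standard estimate, not a figure from the article.

```python
# Doubling the linear resolution from 2048^3 to 4096^3
points_2048 = 2048**3
points_4096 = 4096**3
assert points_4096 == 68_719_476_736       # "more than 65 billion grid points"
assert points_4096 // points_2048 == 8     # 8x the memory per stored field
```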