SDSC Thread, Issue 5, April 2006






User Services Director:
Anke Kamrath

Editor:
Subhashini Sivagnanam

Graphics Designer:
Diana Diehl

Application Designer:
Fariba Fana


Featured Story Corner

Turbulent Mixing at High Schmidt Number

—Dr. P.K. Yeung, Georgia Institute of Technology

The word "turbulence" is often used to describe certain situations of political unrest, economic uncertainty, labor strife, bumpy airplane rides, and, of course, many fluid motions occurring in nature and engineering. A common feature in all of these is the small image to link to turbulence videos presence of disorder, complexity, and lack of predictability in detail even though some definite social or physical constraints do apply. In particular, fluctuations in fluid turbulence appear to be random despite being governed by partial differential equations that express basic and deterministic laws of conservation of mass, momentum, and energy. Although these characteristics of turbulence may seem undesirable, life without turbulence would be strange and difficult, if not impossible; for then coffee and cream will not mix, automobile engines may not run, pollutants will not disperse, and so forth, at least not in reasonable time. These examples show that the mechanisms by which turbulence causes efficient mixing and reduces spatial nonuniformities of momentum, heat, and chemical substances are central to our ability to understand natural phenomena and design improved engineering devices.

Fluid turbulence is a Grand Challenge computational problem for which use of state-of-the-art cyberinfrastructure has always been of great importance. Even in its simplest form (without couplings with combustion chemistry, atmospheric phenomena, etc.), the inherent disorder of turbulence is such that the flow is always unsteady in time and always three-dimensional in space. Therefore, attempts to capture the full flow physics by numerical simulation require the flow variables to be calculated over many time steps at a large set of grid points. More precisely, computational requirements are set by the range of scales over which disorderly fluctuations occur and interact in a nonlinear manner. The range of scales for the velocity fluctuations (which drive the mixing) is determined by a parameter called the Reynolds number, which depends on flow speed, body dimension or size of flow domain, and viscosity, and is high in most applications. In the case of turbulent mixing, we usually consider fluctuations of a "passive scalar"—which may, for example, be a substance concentration in a dilute mixture or a small temperature difference—that does not affect the flow. The range of scales involved then depends further on the molecular diffusivity, which for heat transfer would be highest in liquid metals that conduct heat efficiently and lowest in organic liquids where molecular diffusion is slow and large temperature or concentration differences can develop. We usually characterize the molecular diffusivity by the Schmidt number (Sc), the ratio of fluid viscosity to the molecular diffusivity of the diffusing substance or property. The case of high Schmidt number (low diffusivity) is more difficult and less understood, even in traditional experiments. Fortunately, advances in supercomputing have now made it feasible to simulate high-Sc mixing directly and to provide detailed data on turbulent mixing over a wide range of molecular diffusivities.
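
In standard notation, with U and L a characteristic velocity and length scale of the flow, ν the kinematic viscosity, and D the molecular diffusivity of the scalar, these two parameters are

    Re = U L / ν,        Sc = ν / D,

so that high Schmidt number means the scalar diffuses much more slowly than momentum.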

Thanks to the availability of the 15.6-teraflop DataStar at the San Diego Supercomputer Center (SDSC), we have performed the world's largest simulation of turbulent mixing at high Schmidt number (low diffusivity), using as many as 2048 processors for 2048³ (8.6 billion) grid points. The velocity field in this simulation is sufficiently realistic in its Reynolds number to be described well by a set of well-known similarity hypotheses proposed by the famous Russian mathematician A.N. Kolmogorov in 1941. Our new results for the scalars provide strong evidence in favor of a similarly fundamental result (known as Batchelor scaling, dating from the late 1950s) for the form of the spectrum of scalar fluctuations, which, for lack of adequate data, has not previously been well verified in the literature. The physics of mixing at low diffusivity, which is important in industrial liquid-phase processing and nanomaterial synthesis operations, differs in many respects from that of moderately diffusive scalars, such as temperature fluctuations in air, that are most easily studied in laboratory experiments. Likewise, the effects of increasing the Schmidt number (e.g., by reducing the molecular diffusivity) and increasing the Reynolds number (e.g., by reducing the fluid viscosity) differ in important ways that have not been well appreciated in engineering practice. For example, increasing the Reynolds number usually leads to more pronounced intermittency (a tendency for localized regions of intense fluctuations, illustrated in the accompanying video sequences) that in combustion can lead to local flame extinction and reignition. We find that beyond a certain threshold depending on the Reynolds number, increasing the Schmidt number causes no further increase in intermittency, while motions representing the smallest eddies in the flow take on greater importance in the local topology of the scalar concentration field.
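
In their standard forms, the two results mentioned above are the Kolmogorov (1941) inertial-range spectrum of the velocity field and the Batchelor spectrum of the scalar in the viscous-convective range,

    E(k) = C ε^{2/3} k^{-5/3},        E_θ(k) = C_B χ (ν/ε)^{1/2} k^{-1},

where k is the wavenumber, ε the mean energy dissipation rate, χ the mean scalar dissipation rate, ν the kinematic viscosity, and C and C_B are presumed universal constants. The k^{-1} viscous-convective range becomes prominent only when the Schmidt number is large, which is one reason adequate data have been scarce.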

Movies showing scalar and energy dissipation (a still snapshot accompanies the original article):
Video 1 (.mov, 3.4 MB)
Video 2 (.mov, 1.2 MB)
Video 3 (.mov, 1.8 MB)

Video sequences with different colors showing regions of (1) energy dissipation, (2) scalar dissipation at Schmidt number 4, and (3) scalar dissipation at Schmidt number 64 exceeding certain thresholds in our simulation on a 2048³ domain. High energy dissipation represents intense straining by turbulence, whereas high scalar dissipation rate represents large gradients of scalar fluctuations in space and has a greater tendency to take very large values.
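
In standard notation, the two quantities shown in the videos are the energy dissipation rate and the scalar dissipation rate,

    ε = 2ν s_{ij} s_{ij},        χ = 2D ∇θ·∇θ,

where s_{ij} is the fluctuating strain-rate tensor, θ the scalar fluctuation, ν the kinematic viscosity, and D the molecular diffusivity; this is why high ε marks regions of intense straining and high χ marks regions of steep scalar gradients.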

The numerical method we use to solve the governing equations (Navier-Stokes for momentum and an advection-diffusion equation for the scalars) is Fourier pseudospectral in space and second-order in time. The solution domain is a periodic cube that is divided among the parallel processors into "slabs" of equal size, each containing a whole number of planes. We use the IBM ESSL library for Fast Fourier Transforms and the standard MPI library for inter-processor communication. The code maintains better than 80% parallel efficiency as the number of processors increases, both at fixed problem size and with the problem size scaled to keep the memory per processor constant. The same code has been ported to the IBM Blue Gene at SDSC with no changes required, an encouraging sign for the future because it scales even better there than on DataStar. Furthermore, SDSC consultants D. Pekurovsky and G. Chukkapalli have worked with Georgia Tech PhD student D.A. Donzis to produce a new code that uses an alternative data decomposition, which will allow us to use even more processors than at present. Benchmarking studies indicate that this new code will scale even better than our current production version as processor counts grow.
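
As a rough illustration of how such a slab decomposition works, the sketch below implements the core communication pattern (local transforms within each processor's slab of planes, followed by a global transpose so the remaining direction becomes local) in Python with numpy and mpi4py; these stand in for the production Fortran code with ESSL and MPI, and the grid size and names here are illustrative only.

# Sketch of a slab-decomposed forward 3D FFT with mpi4py + numpy.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
nproc = comm.Get_size()
rank = comm.Get_rank()

N = 64                   # global grid is N x N x N; N must be divisible by nproc
Np = N // nproc          # number of x-planes held by each rank (its "slab")

def fft3d_slab(u):
    """Forward 3D FFT of a real field stored as x-slabs of shape (Np, N, N).

    Returns the transform distributed as y-slabs of shape (N, Np, N//2 + 1).
    """
    # Step 1: transforms in the two locally complete directions (y and z).
    u_hat = np.fft.rfft2(u, axes=(1, 2))                   # (Np, N, N//2+1)
    # Step 2: global transpose. Split the y-direction into nproc blocks and
    # exchange them so that every rank ends up with all x for its y-block.
    send = np.ascontiguousarray(
        u_hat.reshape(Np, nproc, Np, N // 2 + 1).transpose(1, 0, 2, 3))
    recv = np.empty_like(send)
    comm.Alltoall(send, recv)                              # the expensive step
    u_hat = recv.reshape(N, Np, N // 2 + 1)                # x is now local
    # Step 3: transform in the remaining (x) direction.
    return np.fft.fft(u_hat, axis=0)

# Quick check: a single Fourier mode in x should land in the expected bin.
x = (2 * np.pi / N) * np.arange(rank * Np, (rank + 1) * Np)
u = np.sin(3.0 * x)[:, None, None] * np.ones((Np, N, N))
u_hat = fft3d_slab(u)
if rank == 0:                                              # ky = 0 lives on rank 0
    assert np.allclose(u_hat[3, 0, 0], -0.5j * N**3)

A run with, say, mpirun -np 4 on a single node exercises the same all-to-all transpose that dominates the communication cost of codes of this type. One limitation visible in the sketch is that a slab decomposition cannot use more processors than there are planes, which is precisely the motivation for the alternative data decomposition mentioned above.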

With continuing generous resource allocations and strategic staff assistance at SDSC, and with new NSF investments in cyberinfrastructure on the horizon, the future is looking very bright. Currently, our 2048³ grid point simulations of turbulent mixing are the largest known anywhere. An increase to 4096³ grid points (which requires a 16-fold increase in CPU resources) can be expected within the next 2-3 years. Such a development will no doubt advance U.S. scientific leadership in turbulence studies from a computational science viewpoint. Furthermore, we plan to use SDSC's data allocation resources to share the data with interested members of the research community and to work with them to continue building on the large simulation datasets generated with these precious resources.
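
The 16-fold figure follows from simple scaling arithmetic: doubling the linear resolution gives (4096/2048)^3 = 8 times as many grid points, and resolving the finer scales roughly halves the allowable time step, so covering the same physical time costs about 8 × 2 = 16 times as much computation.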

This research is part of a collaborative project with Professor K.R. Sreenivasan (Director of the International Centre for Theoretical Physics, on leave from the College of Engineering at the University of Maryland), supported by the National Science Foundation's Fluid Dynamics and Hydraulics Program. P.K. Yeung's research group has received approximately 2 million hours of CPU allocations at SDSC in the past two years; large computations focused on different aspects of turbulence are also carried out at two other national centers. He can be reached via email at pk.yeung@ae.gatech.edu.

Did you know...?

that SDSC has limited the core file size to 32 MB?
To make good use of this limit, it is recommended to use the MP_COREFILE_FORMAT environment variable (or its associated command-line flag, -corefile_format) to set the format of core files to lightweight core files.
See the Thread article in Issue 4, February 2006 for more details.