
Joel Saltz
University of Maryland and Johns Hopkins University
Mary Wheeler
University of Texas
Carl Kesselman
University of Southern California Information Sciences Institute
Alan Sussman, Tahsin Kurc
University of Maryland
Scott Baden
UC San Diego
Steve Bryant, Shuyu Sun, Clint Dawson, Malgo Peszynska
University of Texas

Multi-Component Models for Energy and the Environment

The use of land disposal as a solution to America's hazardous waste cleanup problem came to a halt in 1986, when Congress passed a Superfund law amendment that demanded more permanent and less costly solutions for dealing with contaminated materials. Since then, more suitable treatment technologies have advanced--thanks to researchers across the country, including NPACI scientists Joel Saltz of the University of Maryland and Johns Hopkins University and Mary Wheeler of the University of Texas. Led by Saltz and Wheeler, the Multi-Component Models for Energy and the Environment alpha project continues to improve the ability to accurately model contaminant flow through an ecosystem by coupling one-of-a-kind groundwater and surface-water simulations.

Soon after the Superfund amendment was passed, the Department of Energy (DOE) was charged with cleaning up thousands of individual restoration sites in 30 states--sites whose contamination affects more than 50% of the drinking water in the United States. Current restoration methods are both costly and labor-intensive, with $140 billion expected to be spent between 2000 and 2070.

Trenches are dug and soil samples are extracted to determine underground characteristics. Conditions above ground must also be determined. Coupling surface and subsurface conditions is very complex--yet necessary to the overall understanding of a restoration site. One solution to this problem is to use multi-component models such as those being developed by Wheeler and Saltz in this alpha project.

"Multi-physics, multi-scale models allow for strategic definition of cost-effective contaminant remediation, as well as efficient coordination and management of cleanup work," said Saltz, professor of Computer Science at Maryland and director of the Pathology Informatics Division at Johns Hopkins. "These models can also be used toward efforts to improve oil and gas production, such as the optimization of well placement, and for biomedical applications such as blood flow modeling."

Figure 1. Combined Models
Applications for multi-scale fluid flow for energy and the environment range from relatively common oil reservoir simulations to more complex geomechanical and geochemical models.

The broader goal of the alpha project is to integrate a grid-based infrastructure that enables on-demand simulation, exploration, and comparison of multiple scenarios of great importance to energy and environmental modeling. More specifically, the project demonstrates environmental and engineering scenarios running on NPACI's Blue Horizon that combine application models via data management systems and tools such as the Active Data Repository (ADR), the Kernel Lattice Parallelism (KeLP) libraries, MetaChaos, and Globus. Integrating this suite of tools will further help a wide range of application communities, such as materials design, develop multi-scale, multi-component simulation codes that address even more complicated problems (Figure 1).

Figure 2. Parallel Subsurface Simulator
Using ParSSim, UT Austin researchers have computed the single-phase flow field in a 15 ft. x 15 ft. x 150 ft. volume of simulated carbonate rock created from measurements on an outcrop in West Texas.
(A) A contour plot of the magnitude of the velocity field with isolated bright spots of higher velocity. (B) Introducing highly reactive fluid into this medium from the plane at X=0 begins to focus fluid flow into channels shown by bright spots. (C) After 58 time steps, four channels are carrying essentially all the flow in the first 30-40 feet. (D and E) After 112 time steps, a single wormhole has emerged as the principal conduit for flow in the first 125 feet. To evaluate the average behavior, researchers must run many such realizations, each realization producing tens of gigabytes of output, so researchers must ultimately seek and visualize patterns from a terabyte data set. The MetaChaos and ADR tools have been harnessed for this purpose.


One of the advantages of multi-component models is the ability to simulate the full behavior of an environmental cleanup target or an oil field, since oil and pollutants interact with surface water as well as groundwater. "Combining models can show where oil will go in the case of a spill--whether in the ground or on the surface of a body of water," said Wheeler, director of the Center for Subsurface Modeling at the University of Texas. "When these models are sufficiently refined, a response can be made to an actual spill."

Given the location of a spill and the current tidal conditions, the researchers can accurately predict how it might spread and where booms might be placed to contain the oil. Achieving this goal requires tremendous cooperation of the kind fostered by NPACI, between applications programmers like those in Wheeler's group at the University of Texas, and infrastructure developers such as those in NPACI's Programming Tools and Environments and Metasystems thrust areas. Together, the team's efforts within NPACI can greatly benefit the understanding, design, and testing of economically feasible oil recovery strategies, as well as decontamination strategies.

The alpha project's progress was demonstrated at SC2000: High-Performance Networking and Computing in Dallas, Texas. As part of the SC2000 Net Challenge, the research team demonstrated reservoir simulation and history matching scenarios using grid-based computing and interactive data set exploration. The Parallel Subsurface Simulator (ParSSim), which couples flow and reactive transport simulations across platforms via MetaChaos, was used to generate data that was then stored in and analyzed with ADR (Figure 2). ParSSim, a scalable parallel simulator, continues to run successfully on heterogeneous machines via Globus, including both local clusters and Blue Horizon. The simulator, coupled with MetaChaos, can also be ported to Cray T3Es and workstations.

UTPROJ3D, a single-phase flow projection code, is also being optimized by Wheeler's group. The UTPROJ3D code couples 3-D surface-water flow models to contaminant and salinity transport models, which can also serve as groundwater codes. UTPROJ3D uses the conservative velocity projection method, improves local mass conservation, and formulates the projection with the mixed finite element method. The 3-D projection code can handle highly irregular problems, since it employs non-matching unstructured grids and uses "mortars" to glue the grids together.
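The mortar idea can be sketched in one dimension: a minimal illustration (not the actual UTPROJ3D implementation) of an L2-style projection that transfers a piecewise-constant flux between two non-matching interface meshes while conserving the total flux across the interface. The mesh sizes and flux values below are made up for the example.

```python
import numpy as np

def l2_project_piecewise_constant(src_edges, src_vals, dst_edges):
    """Project a piecewise-constant flux on the mesh src_edges onto the
    non-matching mesh dst_edges by overlap-weighted averaging (an exact
    L2 projection for piecewise constants). Conserves the total integral."""
    dst_vals = np.zeros(len(dst_edges) - 1)
    for j in range(len(dst_edges) - 1):
        a, b = dst_edges[j], dst_edges[j + 1]
        acc = 0.0
        for i in range(len(src_edges) - 1):
            # Length of overlap between source cell i and destination cell j.
            lo = max(a, src_edges[i])
            hi = min(b, src_edges[i + 1])
            if hi > lo:
                acc += src_vals[i] * (hi - lo)
        dst_vals[j] = acc / (b - a)
    return dst_vals

# Fine interface mesh on one subdomain, coarse "mortar" mesh on the other.
fine_edges = np.linspace(0.0, 1.0, 9)                      # 8 equal cells
fine_flux = np.array([1.0, 2.0, 2.0, 3.0, 3.0, 2.0, 1.0, 1.0])
mortar_edges = np.array([0.0, 0.3, 0.7, 1.0])              # 3 non-matching cells
mortar_flux = l2_project_piecewise_constant(fine_edges, fine_flux, mortar_edges)

# The projection preserves the total flux crossing the interface.
total_fine = np.sum(fine_flux * np.diff(fine_edges))
total_mortar = np.sum(mortar_flux * np.diff(mortar_edges))
```

The conservation property is what makes this style of interface coupling attractive: mass balance holds across subdomains even though the grids on either side do not line up.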

Figure 3. Improved Resolution
Increasing computational power means more detailed and accurate simulations of our environment. For instance, the image on the left represents a computational grid of the Atlantic/Gulf of Mexico in 1994, while the right is a representation created in 2000.


The modeling of flow and reactive transport can involve many variables, time steps, and fine grid resolution, resulting in terabytes of simulation data. Fast access is critical to efficient interpretation of this data, and computational demands are intense. The database tools that support researchers' needs to synthesize data play a significant role within the project. These tools, such as ADR, can aggregate data in various ways to answer questions about what would happen in given locations under different conditions. The tools can also be used to identify instances of a given physical phenomenon in each scenario, and allow further aggregation operations on instances of the physical phenomena across realizations.
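As a rough illustration of this kind of aggregation (using made-up arrays as a stand-in for ADR-managed output, not the ADR API), ensemble statistics and per-cell phenomenon frequencies across realizations might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for stored simulation output: 5 realizations of a
# velocity-magnitude field on a small 4 x 4 x 10 grid.
realizations = rng.random((5, 4, 4, 10))

# Aggregate across realizations to summarize average behavior.
mean_field = realizations.mean(axis=0)   # per-cell ensemble mean
max_field = realizations.max(axis=0)     # per-cell ensemble maximum

# Identify instances of a physical phenomenon (here: high-flow cells)
# in each realization, then aggregate those instances across realizations.
threshold = 0.9
high_flow = realizations > threshold     # boolean mask per realization
frequency = high_flow.mean(axis=0)       # fraction of realizations flagging each cell
```

At terabyte scale this is exactly the kind of reduction ADR performs out of core, close to where the data is stored, rather than in memory as sketched here.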

KeLP, MetaChaos, and ADR are essential components of an accurate multi-component modeling system. ADR, developed as a set of modular services implemented in C++, is used for storage and processing of simulation output data. KeLP, a C++ class library, implements portable scientific applications on distributed-memory parallel computers and optimizes simulations for hierarchical architectures, while MetaChaos coordinates communication between simulations.

"KeLP is especially important for highly heterogeneous execution environments, such as Blue Horizon," said UC San Diego researcher Scott Baden, who developed the tool. "We developed it so that applications can adapt to data- or hardware-dependent conditions at run time so that it's able to implement non-parallel codes. The KeLP library has also been extended with both hierarchical communication for tolerance of high communication delays, and gather-scatter communication to support finite-element methods."

Work also continues on application development using ADR for history matching and the Integrated Parallel Accurate Reservoir Simulator (IPARS) for performing the simulations. Specifically, the application uses IPARS to carry out a set of reservoir simulations with different parameter values, while the output (both well data and grid data) is stored in ADR. IPARS is portable across several serial and parallel platforms, includes more than eight multiphase and multi-component models, and uses multi-block grids, which can be non-matching. Perhaps the most novel aspect of this application is the multiphysics capability, which allows flow models to be coupled with each other as well as with third-party codes for geomechanics or surface water. Together, IPARS and ADR will eventually be used to examine petroleum production scenarios and history matching.
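The sweep-then-match workflow can be sketched with a toy decline model standing in for IPARS and a plain dictionary standing in for ADR storage; the model, its single permeability-like parameter, and the least-squares misfit are all illustrative assumptions, not the project's actual physics:

```python
import numpy as np

def toy_reservoir_model(k, times):
    """Hypothetical stand-in for an IPARS run: a well rate that
    declines at a speed controlled by one parameter k."""
    return np.exp(-k * times)

times = np.linspace(0.0, 5.0, 50)
observed = toy_reservoir_model(0.7, times)   # synthetic "field" history

# Parameter sweep: one simulation per candidate value; outputs are
# stored keyed by parameter value (a minimal stand-in for ADR).
store = {k: toy_reservoir_model(k, times) for k in np.linspace(0.1, 1.5, 15)}

# History matching: pick the realization with the smallest misfit
# against the observed well history.
best = min(store, key=lambda k: np.sum((store[k] - observed) ** 2))
```

In the real application each "run" is a parallel multiphase simulation producing grid data as well as well data, and the selection step becomes an ADR query over many stored realizations rather than a dictionary lookup.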
--KMB