
ALPHA PROJECTS

Multi-Component Models for Energy and the Environment

Mary Wheeler, University of Texas at Austin

Joel Saltz, University of Maryland, Johns Hopkins University

Alan Sussman, Tahsin Kurc, University of Maryland
Scott Baden, UC San Diego
Carl Kesselman, USC/Information Sciences Institute
Malgo Peszynska, Steve Bryant, Robert McLay, Clint Dawson, Jichun Li, University of Texas
Jay Boisseau, Richard Frost, Tim Kaiser, SDSC

What steps can be taken to protect shore birds and other nearby wildlife when an oil tanker runs aground, spilling its cargo into coastal waters? How can petroleum engineers wring the last drop of oil from an oil field while avoiding groundwater pollution? Simulations can suggest solutions for both of these problems, but the most accurate answers require scientists to combine complex mathematical models for different aspects of the physical environment, whether it's an underground reservoir or a shallow estuary. An alpha project, led by Mary Wheeler at the University of Texas and Joel Saltz at the University of Maryland, is developing the models, and a framework for combining them, to help formulate strategies to minimize environmental damage in the area of an oil spill and to recover oil safely and efficiently.





"The immediate goal here is to improve our ability to model the flow of contaminants through the ecosystem accurately by coupling groundwater and surface-water simulations," said Saltz, project co-leader, NPACI Programming Tools and Environments thrust area leader, and computer science professor at the University of Maryland. "The broader goal is to integrate a Grid-based infrastructure that will allow on-demand simulation, exploration, and comparison of multiple scenarios of great importance to energy and environmental modeling." Saltz is also a professor in the Department of Pathology at Johns Hopkins University and the director of pathology informatics at the Johns Hopkins Medical Institutions.

One advantage of such combined models is the ability to simulate the full behavior of an environmental cleanup target or an oil field, since oil and pollutants interact with water both on the surface and in the ground. Combining models can show where oil will go in the case of a spill--for example, whether it seeps into the ground or spreads across the surface of a body of water.

"When these models are sufficiently refined, we will be able to respond to an actual spill," said Wheeler, director of the Center for Subsurface Modeling (CSM) at the Texas Institute for Computational and Applied Mathematics (TICAM). "Given its location and the current tidal conditions, we can accurately predict how it might spread and where booms might be laid to contain it. Getting there requires tremendous cooperation of the sort fostered by NPACI, between applications programmers like ourselves and infrastructure developers such as those in the Maryland group."

The key infrastructure challenges involve the coupling of the various models, particularly the groundwater and surface-water models, in such a way that a complete simulation environment can be created. The groundwater and surface-water models can be used to simulate changes over long periods, and they generate large amounts of output data that could be used by other models to simulate various crisis-management scenarios--if that data can be captured.


Figure 1. Improved Oil Recovery
Water movement through an oil reservoir is critical to the economics of the field. The blue contour shows the advance of water from two injection wells in the lobes of an oxbow reservoir. The timing of water arrival at these wells is important in planning oil recovery strategies. This visualization by the Center for Computational Visualization is based on simulations by the Center for Subsurface Modeling; both centers are at the University of Texas and NPACI partners. This research is sponsored in part by an NSF KDI program.


"These simulations produce data that will be used by other programs, using Meta-Chaos for communication between computer systems," said Alan Sussman, a research scientist in the Computer Science Department at Maryland. By storing the results of past ground- and surface-water simulations, researchers can access this data to drive new models such as the flow of an oil spill in a bay over time. "The key thing here is for the programs to have multiple scenarios," he said. "We can perform a simulation once, store all the data in the Active Data Repository (ADR), and use the results in other simulations. We can access ADR multiple times, so that no one has to rerun the original simulation."
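The compute-once, reuse-many pattern Sussman describes can be pictured with a minimal sketch. Plain pickle files stand in for ADR here; all function and key names are illustrative, not ADR's actual interface:

```python
import os
import pickle
import tempfile

def circulation_velocities(n_steps):
    """Stand-in for an expensive hydrodynamics run: returns a time
    series of (u, v) velocities at a single grid point."""
    return [(0.1 * t, 0.05 * t) for t in range(n_steps)]

def fetch_or_run(store_dir, key, simulate, *args):
    """Return stored results if present; otherwise run the simulation
    once and store its output so later scenarios can reuse it."""
    path = os.path.join(store_dir, key + ".pkl")
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)
    result = simulate(*args)
    with open(path, "wb") as f:
        pickle.dump(result, f)
    return result

store = tempfile.mkdtemp()
first = fetch_or_run(store, "bay_tides", circulation_velocities, 5)   # runs the simulation
again = fetch_or_run(store, "bay_tides", circulation_velocities, 5)   # served from storage
```

A real repository adds what the file cache lacks: parallel storage and retrieval, range queries over multidimensional data sets, and server-side processing before results are shipped to the requesting program.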

Wheeler's group has developed several simulators, including the Parallel Advanced Circulation Model (PADCIRC) and UTBEST, which use various finite-element schemes to capture the circulation patterns in shallow coastal seas, bays, and estuaries. PADCIRC is an efficient, parallel simulator for distributed-memory platforms. But for a complete simulation of a bay, the PADCIRC or UTBEST hydrodynamics simulators need to be linked to a chemical transport simulator that uses the water flow patterns to show the time-dependent transport of chemicals--such as an oil spill. Since the chemical reactions have little effect on the circulation patterns, the fluid velocity data can be generated once and used for many simulations.

Modeling more than 20 variables, including sediment transport, CE-QUAL-ICM is a chemical transport simulator based on an unstructured finite-volume method, explicit in time, that can be run in one, two, or three dimensions. Wheeler's group has also developed a parallel version of CE-QUAL-ICM called PARWOM. Another model developed by the Wheeler group is UT-TRANS, which computes advection, diffusion, and reactions of contaminants in shallow waters, using a mass-conservative, upwind finite-volume method and explicit time-stepping. PARSSIM1 simulates flow and reactive transport, handling a variety of biochemical and geochemical interactions. While this code considers a single flowing phase, an arbitrary number of stationary mineral and fluid phases are possible, as is mass transfer between the phases. Finally, the Integrated Parallel Accurate Reservoir Simulator (IPARS) is a framework for multiphase and multicomponent flow and transport that currently encompasses nine different physical models and numerical algorithms, and can solve very large problems in minutes on state-of-the-art parallel platforms (Figure 1).
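The transport codes above share a common numerical core: a mass-conservative, explicit, upwind finite-volume update. A one-dimensional toy version of that scheme (illustrative only, not the CE-QUAL-ICM or UT-TRANS source) shows how fluxes at cell faces conserve total mass:

```python
def upwind_step(c, u, dx, dt):
    """One explicit upwind finite-volume step for 1D advection of a
    concentration field c with constant velocity u > 0 on a periodic
    domain. Because each face flux is subtracted from one cell and
    added to its neighbor, total mass is conserved exactly."""
    n = len(c)
    # flux[i] is the flux through the left face of cell i; with u > 0
    # the upwind value is the cell to the left.
    flux = [u * c[i - 1] for i in range(n)]
    return [c[i] - (dt / dx) * (flux[(i + 1) % n] - flux[i]) for i in range(n)]

# The explicit scheme is stable only under the CFL condition u*dt/dx <= 1.
c = [0.0, 0.0, 1.0, 0.0, 0.0]   # a unit pulse of contaminant
for _ in range(3):
    c = upwind_step(c, u=1.0, dx=1.0, dt=0.5)
```

The production codes apply the same idea on unstructured meshes in two or three dimensions, with diffusion and reaction terms added to the update.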



The proposed application scenarios require an architecture that combines efforts from NPACI's Metasystems, Programming Tools and Environments, Earth Systems Science, and Engineering thrust areas. IPARS provides a framework for linking several physical models into one simulation, but the complete simulation environment will use an NPACI-wide Globus infrastructure, including current NPACI systems and later the teraflops IBM SP.

The progress of the various NPACI projects in 1999 includes improvements to the stability of Globus; an initial implementation of linked subsurface and surface-water models using Meta-Chaos, ADR, and the Kernel Lattice Parallelism (KeLP) libraries; and Meta-Chaos interfaces to ADR, KeLP, and Globus. To meet the alpha project's goals, researchers will combine the capabilities of these systems and reuse the output of earlier models in building new ones.

"Globus services let us run multiple parallel programs simultaneously, such as ADR," Sussman said. "It starts programs at different sites, on different machines, and provides security for all of them." Globus will support remote job submission, resource allocation, communication, high-performance remote I/O, security, and resource discovery. KeLP will optimize simulations for hierarchical architectures, Meta-Chaos will coordinate communication between simulations, and ADR will store and process simulation output.

IPARS and ADR will be joined via Meta-Chaos, which makes it possible to integrate multiple data-parallel programs in a single application. It provides a bridge between code libraries, as well as between parallel applications. "For application developers, Meta-Chaos provides a relatively simple interface to the other libraries and languages," Saltz said. "For library developers, Meta-Chaos requires relatively minimal modifications to provide this bridge."
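The bridging Meta-Chaos performs can be pictured as mapping between two programs' data distributions: each library describes how its arrays are laid out, and the bridge computes which elements must move between which processes. A toy sketch of that schedule computation (hypothetical function names, not the Meta-Chaos API):

```python
def block_owner(i, n, nprocs):
    """Owner of global index i under a contiguous block distribution."""
    block = -(-n // nprocs)   # ceiling division: elements per block
    return i // block

def cyclic_owner(i, n, nprocs):
    """Owner of global index i under a round-robin (cyclic)
    distribution; n is unused but kept for a uniform signature."""
    return i % nprocs

def communication_schedule(n, nprocs, src_owner, dst_owner):
    """Group global indices by (source process, destination process):
    the pairs a Meta-Chaos-style bridge would turn into messages."""
    moves = {}
    for i in range(n):
        key = (src_owner(i, n, nprocs), dst_owner(i, n, nprocs))
        moves.setdefault(key, []).append(i)
    return moves

# Redistribute an 8-element array from a block layout in one program
# to a cyclic layout in another, across 2 processes per program.
sched = communication_schedule(8, 2, block_owner, cyclic_owner)
```

Once the schedule exists, the same message pattern can be replayed each time the coupled models exchange a field, which is why building it once is worthwhile.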

Meta-Chaos will be used to exchange distributed data structures between the surface-water and the groundwater models. Meta-Chaos is now using the services of Globus to start programs and manage the underlying interprocessor communication. The existing KeLP and Meta-Chaos interface will be used to couple the surface-water and groundwater models with each other and with ADR.

"For pollution remediation calculations with IPARS, the ADR will help by storing, retrieving, and processing data generated by surface and groundwater models, as well as sensor data sets," Wheeler said. The models will be customized to use ADR--developed by Saltz and Sussman at Maryland--to store, retrieve, process, and visualize results. ADR lets researchers build parallel database systems that integrate the storage, retrieval, and processing of multidimensional data sets on parallel machines. In this project, ADR will be demonstrated on the large data sets generated by surface-water and groundwater models and sensors.

The various models will be parallelized using KeLP, a C++ class library for implementing portable scientific applications on distributed-memory parallel computers. KeLP, developed by Scott Baden of the Computer Science and Engineering Department at UC San Diego, supports efficient implementations across a wide range of architectures. The KeLP library is especially important for highly heterogeneous execution environments, including the teraflops IBM SP that will be installed at SDSC. KeLP lets applications adapt to data- or hardware-dependent conditions at run time, supports run-time analysis of communication, and manages elaborate, structured, irregular data transfers. Codes not already parallelized will be implemented using KeLP.
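One of the structured transfers such a library manages is the ghost-cell exchange: each block of a decomposed grid keeps a halo of copied neighbor values so stencil updates stay local. A one-dimensional Python sketch of the pattern (KeLP itself is a C++ library; none of its classes appear here):

```python
def decompose(data, nblocks):
    """Split a 1D grid into contiguous blocks, each padded with one
    ghost cell on either side, as a block-structured library would."""
    n = len(data)
    size = -(-n // nblocks)   # ceiling division
    blocks = [data[i * size:(i + 1) * size] for i in range(nblocks)]
    return [[0.0] + b + [0.0] for b in blocks]   # add ghost padding

def exchange_ghosts(blocks):
    """Fill each block's ghost cells from its neighbors' edge values,
    standing in for the interprocessor messages on a real machine."""
    for k in range(len(blocks)):
        if k > 0:
            blocks[k][0] = blocks[k - 1][-2]    # left ghost <- neighbor's last interior cell
        if k < len(blocks) - 1:
            blocks[k][-1] = blocks[k + 1][1]    # right ghost <- neighbor's first interior cell

grid = [float(i) for i in range(8)]
blocks = decompose(grid, 2)
exchange_ghosts(blocks)
```

On a hierarchical machine the same exchange can be staged so that messages within a node are cheap local copies while only the block-boundary data crosses the network, which is the kind of optimization KeLP is designed to express.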

Existing parallel codes will be optimized for the teraflops machine at SDSC by Richard Frost and Tim Kaiser of SDSC's Scientific Computing group, led by Jay Boisseau. "We'll be optimizing tools such as KeLP, as well as applications that run on top of KeLP," Boisseau said. Boisseau expects the porting and optimization to be complete within the next year. The KeLP infrastructure will be extended with both hierarchical communication for tolerance of high communication delays, and gather/scatter communication to support finite-element methods.

KeLP will be integrated with ADR and Globus and generalized for static, unstructured meshes. ADR interfaces to KeLP data structures will support the very large, multi-resolution, multi-scale, and out-of-core grids used in the models. In conjunction with these interfaces, ADR will be customized to support sensor data sets and to query and process simulation data between hydrodynamic and projection codes.

"Together, our efforts within NPACI can greatly benefit the understanding, design, and testing of economically feasible oil recovery strategies, as well as decontamination strategies," Wheeler said. "These are challenges we are all eager to take on." --AV
