
ALPHA PROJECTS

Katherine Yelick
UC Berkeley
Susan Graham
UC Berkeley
Charles Peskin
David McQueen
Nathaniel Cowen
Courant Institute of Mathematical Sciences, New York University
Phillip Colella
Lawrence Berkeley National Laboratory
Scott Baden
UC San Diego
Joel Saltz
University of Maryland


Adaptive Computations for Fluids in Biological Systems

The human heart must beat unceasingly, presenting researchers with major obstacles in getting an inside look at how the beating heart actually works. Katherine Yelick, a computer scientist at UC Berkeley, is leading a new NPACI alpha project to develop advanced software tools that will enable researchers to use today's largest supercomputers to run simulations of the heart that are more detailed than ever before--an important step forward in revealing the heart's hidden mysteries and designing improved artificial heart valves.


Charles Peskin and David McQueen of the Courant Institute of Mathematical Sciences at New York University have successfully simulated blood flow in the heart for several years on earlier generations of supercomputers. But the swirling motions of blood in the beating heart are so intricate--especially around the delicate valve leaflets that control blood flow--that certain features of the flow would "slip through the cracks" in their model, which was limited on even the largest previous computers to a grid spacing that was too coarse to capture all the necessary detail.

Today's terascale supercomputers such as NPACI's Blue Horizon at SDSC, capable of more than one trillion calculations per second with half a terabyte of memory, offer Peskin the promise of greater detail and realism in his simulations of blood flow in the heart. But this increased computing power comes at a price: It is more complex to write computer code that will run efficiently on parallel machines like Blue Horizon. So Yelick and her computer science colleagues are collaborating in an NPACI alpha project to develop software tools that will enable these models to efficiently take full advantage of terascale supercomputers.

"We have two main goals in this project. One concrete goal is to get the heart code developed by Peskin and McQueen, which has been running on smaller machines, running at higher resolution on the larger Blue Horizon," Yelick says. "The other goal is to make available to the computational biology community a generic immersed boundary code that will run on distributed parallel machines." This will make it possible for other researchers to simulate the heart and other important biological flows on parallel supercomputers more easily and accurately than previously possible.


This NPACI alpha project is working toward an end-to-end demonstration of how a modern parallel language and compiler, Titanium, running on Blue Horizon with improved equation solvers and algorithms for handling adaptive computational grids, can support an important scientific application--simulating blood flow in the human heart. The researchers demonstrated parts of the project at SC2000.

"We feel it's important that scientists working on a significant problem are involved right from the start. This way we're testing our software on a real-world application as we go," Yelick says. "But such collaboration is also challenging, because we have to think outside our own discipline and speak each other's languages to communicate and solve problems."

The researchers are collaborating on two levels. One level is within the computer science field with project leader Yelick and Susan Graham of UC Berkeley, Phillip Colella of Lawrence Berkeley National Laboratory (LBNL), as well as Scott Baden of UC San Diego and Joel Saltz of the University of Maryland and Johns Hopkins University. In a second level of collaboration, the computer science participants are cooperating across disciplines with heart researchers Charles Peskin and David McQueen who developed the specific application being supported, and with Nathaniel Cowen who, with McQueen and Peskin, has written a general purpose software package for the immersed boundary method.

Each researcher is contributing a different component. Yelick is porting the Titanium language, which provides greater support for parallel computing, to Blue Horizon, as well as porting the immersed boundary code to Titanium and developing scalable solver technology for uniform grids; Colella is developing improved algorithms for handling adaptive computational grids, particularly for flows modeled by the immersed boundary method; Baden is developing communication support based on Kernel Lattice Parallelism (KeLP) for grid-based computation on Blue Horizon; and Saltz is hardening the Titanium front-end for the Active Data Repository (ADR) storage facility to handle the immense data sets generated in these realistic simulations. On the application side, Peskin and McQueen are supplying the heart-specific features of the simulation model and, with Nathaniel Cowen, the general immersed boundary code for simulating biological flows.

In addition to the modern Titanium language on Blue Horizon, key components, including the scalable adaptive solver, will be provided to researchers as part of the NPACI infrastructure, along with the full heart model.



"The most important reason we developed the heart model was to help design improved artificial heart valves," Peskin said. But modeling the heart presents a number of challenges: Beyond the usual difficulties of modeling fluid flows within rigid boundaries, the heart walls and valves move and interact with the flow, both driving and responding to it. And not only is the heart muscle elastic, it is active--contracting and relaxing, with elastic properties that change during the contraction-relaxation cycle.

Thus, the researchers need to solve both the fluid mechanics and elasticity problems simultaneously. To accomplish this, the immersed boundary method models biological systems like the heart as a set of elastic fibers immersed in an incompressible fluid, which avoids the complexities of applying boundary conditions on the moving location of the heart walls. For greater realism, the researchers try to make the model fibers follow the same paths as muscle and collagen fibers in the real heart muscle and valves (Figure 1).
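The core coupling in the immersed boundary method can be illustrated with a minimal one-dimensional Python sketch. This is not the heart code itself: the grid size, the single tethered fiber point, and the sample velocity field are invented for illustration. The 4-point smoothed delta function, however, is the standard one used in Peskin's method to spread fiber forces onto the fluid grid and to interpolate fluid velocities back to the fibers.

```python
import numpy as np

# Toy 1D illustration of the immersed boundary method's two coupling
# operations: spreading Lagrangian fiber forces onto a fixed Eulerian
# fluid grid, and interpolating grid velocities back to the fibers.
# Grid size and fiber data are illustrative, not the heart model's.

N = 64                      # Eulerian grid points
h = 1.0 / N                 # grid spacing
x_grid = np.arange(N) * h

def delta_h(r):
    """Peskin's 4-point smoothed delta function."""
    r = np.abs(r) / h
    d = np.zeros_like(r)
    near = r < 1.0
    far = (r >= 1.0) & (r < 2.0)
    d[near] = (3 - 2 * r[near] + np.sqrt(1 + 4 * r[near] - 4 * r[near] ** 2)) / (8 * h)
    d[far] = (5 - 2 * r[far] - np.sqrt(-7 + 12 * r[far] - 4 * r[far] ** 2)) / (8 * h)
    return d

def spread_force(X, F):
    """Spread fiber forces F at positions X onto the grid (ds = 1 here)."""
    f = np.zeros(N)
    for Xk, Fk in zip(X, F):
        f += Fk * delta_h(x_grid - Xk)
    return f

def interpolate_velocity(u, X):
    """Interpolate the grid velocity u to the fiber positions X."""
    return np.array([np.sum(u * delta_h(x_grid - Xk)) * h for Xk in X])

# One round trip: a single fiber point exerts force on the fluid grid,
# then moves at the locally interpolated fluid velocity.
X = np.array([0.5])                  # fiber position
F = np.array([1.0])                  # fiber force
f_grid = spread_force(X, F)          # Eulerian force density felt by the fluid
u = np.sin(2 * np.pi * x_grid)       # a sample periodic velocity field
U = interpolate_velocity(u, X)       # velocity at which the fiber point moves
```

The smoothed delta function is constructed so that the spread force is conserved on the grid, which is why no boundary conditions need to be applied at the moving fiber locations.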

Running on Blue Horizon, the new version of the heart code will enable both finer grids, capturing more of the intricate flow detail, and more intelligent adaptive mesh algorithms that, as the flow evolves during the simulation, will "zero in" on the computationally challenging areas where the flow is more complex, such as near the delicate valve leaflets.
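The "zeroing in" idea behind adaptive meshes can be sketched with a toy refinement criterion: flag only those cells where the flow field changes rapidly. The tanh test field and the gradient threshold below are invented for illustration and are not the project's actual algorithms, which Colella's group is developing for flows modeled by the immersed boundary method.

```python
import numpy as np

# Minimal sketch of adaptive refinement: a velocity field with a sharp
# internal layer (standing in for complex flow near a valve leaflet) is
# scanned, and only the cells with a steep gradient are tagged for
# refinement. Field and threshold are illustrative assumptions.

x = np.linspace(0.0, 1.0, 65)              # coarse cell edges
u = np.tanh((x - 0.5) / 0.02)              # velocity with a sharp layer at x = 0.5
grad = np.abs(np.diff(u)) / np.diff(x)     # per-cell gradient magnitude

threshold = 10.0
flagged = grad > threshold                 # cells tagged for refinement

# Most of the domain stays coarse; effort concentrates near the layer.
print(f"{flagged.sum()} of {flagged.size} cells flagged for refinement")
```

In a real adaptive mesh refinement code, the flagged cells would be covered by finer grid patches and the criterion re-evaluated as the flow evolves.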

"Being able to run on Blue Horizon will give us more accurate and realistic simulations, especially of the vortices shed by the valve leaflets," Peskin said. "The biggest problem in designing mechanical heart valves is to avoid clotting, and very fine-scale differences in fluid flow can influence clotting. In either of two cases--if there's too much shear or if the flow is stagnant around the valve--you may get clotting, so simulations at higher resolution will be an important step toward designing improved mechanical heart valves on the computer." (Figures 2 and 3)

Beyond heart simulations, the immersed boundary method has a wide range of applications. Peskin and other groups in the U.S. and Europe have used it to model biological systems such as embryo growth and platelet coagulation during blood clotting, as well as other systems including intracellular fluid dynamics with Brownian motion, high-Reynolds number swimming, the fluid and tissue mechanics of the brain, and fiber-fluid interactions in the papermaking process.

Figure 1. Ventricular Ejection
Flow through the aortic valve during systole (ventricular ejection). At this time in the cycle the aortic valve is open and the mitral valve is closed. In Figures 1–3, the heart is represented as a wire-frame model where the "wires" are heart muscle fibers (light brown) and valve and artery fibers (white), with oxygen-enriched blood on the heart's left side (red) and oxygen-depleted blood on the right side (blue).
Figure 2. Mitral Valve Flow
A thin section through the heart showing flow patterns during early diastole (ventricular filling). This point of view shows vortices behind the leaflets of the mitral valve in the left ventricle. Being able to accurately simulate the vortices shed by valve leaflets is important in improving the design of artificial mechanical heart valves.
Figure 3. Ventricular Vortices
A section through the heart showing flow patterns during early diastole (ventricular filling). This point of view shows vortex motion in the right ventricle.


The goal of this NPACI alpha project is, through this specific application in heart modeling, to address the broader need for software support on distributed parallel machines. In the past, researchers could reasonably expect to write codes that would solve their scientific problems, but that has become less true as parallel architectures have grown more complex. At the same time, today's larger machines make possible more sophisticated calculations, which in turn involve more complex computational methods, models, and coding. So the opportunity to solve larger and more realistic problems is there, but the path to reach it is demanding, leaving what some call a "software void."

"One of the primary motivations for developing Titanium was to enable easier programming on today's parallel architectures," Yelick says. "By providing high level array abstractions and avoiding tedious message passing code, Titanium allows application researchers to focus on their science and how to best model it." Titanium has a global address space to help programmers express the kinds of complex distributed data structures that arise in fluid dynamics simulations--in particular, those involving adaptivity or irregular boundaries. Titanium also provides a uniform model for machines like Blue Horizon that have a mixture of shared and distributed memory in hardware, effectively hiding this distinction within the compiler and run time system.
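What a global address space buys the programmer can be suggested with a small conceptual sketch, written here in Python rather than Titanium. The `GlobalArray` class below is a hypothetical stand-in: its data is physically split into per-"processor" blocks, but user code indexes it with global coordinates, and the owner lookup is hidden underneath, much as Titanium's compiler and runtime hide the shared versus distributed memory distinction on a machine like Blue Horizon.

```python
import numpy as np

# Conceptual sketch (Python, not Titanium syntax) of a global-view
# distributed array: storage is partitioned into per-processor blocks,
# but indexing uses global coordinates and hides ownership. This
# illustrates the programming-model idea only, not Titanium's API.

class GlobalArray:
    def __init__(self, n, nprocs):
        # Partition n elements into nprocs contiguous local blocks.
        self.bounds = np.linspace(0, n, nprocs + 1).astype(int)
        self.blocks = [np.zeros(self.bounds[p + 1] - self.bounds[p])
                       for p in range(nprocs)]

    def _owner(self, i):
        # Which "processor" owns global index i, and at what local offset.
        p = int(np.searchsorted(self.bounds, i, side="right")) - 1
        return p, i - self.bounds[p]

    def __getitem__(self, i):
        p, off = self._owner(i)
        return self.blocks[p][off]

    def __setitem__(self, i, v):
        p, off = self._owner(i)
        self.blocks[p][off] = v

# User code is written against global indices; locality is handled
# underneath, so local and remote accesses look identical.
a = GlobalArray(100, nprocs=4)
a[0] = 1.0
a[99] = 2.0   # lives in the last block, but the access looks the same
```

In a message-passing style, the second assignment would instead require the owning process to post a receive and the writer to send, which is exactly the bookkeeping a global address space lets the application programmer skip.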

The resulting software will form part of the growing NPACI grid infrastructure. "This project involves a number of us in the university computer science community creating modern, portable tools for terascale parallel computing that will enable researchers to develop important biological simulations with greater ease," Yelick says. "Once these components are developed, NPACI and SDSC will render a valuable service by helping disseminate, maintain, and train people in the use of this code for biological flow modeling, as well as modern parallel languages like Titanium."
--PT