
ENGINEERING

Bridging Simulation Scales and Scientific Communities to Design Custom Materials

Greg Rodin
Associate Professor and Temple Foundation Fellow, Texas Institute for Computational and Applied Mathematics and Department of Aerospace Engineering and Engineering Mechanics, University of Texas at Austin

Furniture, sporting equipment, and military aircraft are just some of the items that today get their strength, durability, and light weight from composite materials. Composites combine two or more materials, such as thin glass fibers embedded in a plastic, to achieve greater strength for their weight than more common building materials, such as steel. Greg Rodin at the University of Texas and a team of NPACI collaborators are developing computational methods to simulate how composites deform and ultimately break under strain. These methods promise to save time and money by making it possible to design materials with desired features more quickly than traditional trial-and-error laboratory methods.

Rodin is leading an NSF Grand Challenge research project that is developing methods to study much larger pieces of composites than current simulations can handle. Yet these same pieces are still too small to represent important features of the materials at a size that would actually be used to build large engineering structures like airplanes. Rodin and NPACI collaborators are working to simulate still larger pieces of materials using NPACI's parallel computers while improving the accuracy of the underlying models.




Stress in a Composite

A simulation by graduate student Kumar Vemaganti of the Texas Institute for Computational and Applied Mathematics shows effective stress field in a composite lamina. High stress concentration at the interface between the matrix and fibers leads to debonding.


"Current computational methods are nowhere near being able to solve 3-D models of fiber-reinforced composites as large as the human hand, much less an airplane," Rodin said. "The bottom line is really the problem size, and these problem sizes are important for creating realistic materials."

A piece of composite material about the size of a human hand might have up to 10 million fibers, each 10 microns thick. Typically, the fibers are arranged in one of several standard architectures--aligned, woven, or cross-hatched, for example--and each arrangement gives a material different qualities. The fibers are then embedded in the matrix, an epoxy or other material that gives the composite its shape. To perform accurate simulations, the model for such a hand-sized piece might have 10 billion equations.
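The arithmetic behind those figures can be sketched in a few lines. The per-fiber unknown count below is an assumed, illustrative value chosen only to show how the article's numbers fit together; it is not a figure from Rodin's model.

```python
# Back-of-envelope model size for a hand-sized composite sample,
# using the fiber count quoted in the article.
fibers = 10_000_000          # up to 10 million fibers, each ~10 microns thick
unknowns_per_fiber = 1_000   # assumed resolution per fiber (illustrative only)

total_equations = fibers * unknowns_per_fiber
print(f"{total_equations:,} equations")  # 10,000,000,000 -> the "10 billion"

# Even storing one double-precision value per unknown is substantial:
bytes_per_double = 8
gb = total_equations * bytes_per_double / 1e9
print(f"~{gb:.0f} GB just for the solution vector")
```

The point of the estimate is that memory alone, before any solver work, already pushes such problems onto large parallel machines.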

"No one else is able to look at several hundred fibers embedded in a matrix in 3-D," Rodin said. "Our goal is to look at systems with up to 10,000 fibers in 3-D on today's generation of hardware. Further, we believe that new statistical descriptions of composite materials will allow us to analyze truly large engineering problems."

The code developed by Rodin's team at the Texas Institute for Computational and Applied Mathematics (TICAM) is called FLEMS, for Fast Linear Elastic Many-particle Solvers. Most codes fail when they try to model more than two fibers in three dimensions. FLEMS, however, can currently solve 3-D problems with several fibers on a workstation and can model tens of fibers on small parallel systems.
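The article does not describe FLEMS's internals, but linear elastic solvers at these problem sizes typically avoid direct factorization and instead use iterative methods, whose cost per iteration is dominated by a matrix-vector product. As a generic illustration only (not FLEMS code), here is a minimal conjugate-gradient solver in pure Python that takes the matrix-vector product as a black box, the way a fast many-particle method would supply it:

```python
def cg(matvec, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A,
    given only a function computing A @ p (the matvec)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]          # residual b - A x, with x = 0 initially
    p = r[:]          # search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs_old / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:
            break
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

# Usage on a tiny 2x2 symmetric positive-definite system:
def matvec(v):
    return [4 * v[0] + v[1], v[0] + 3 * v[1]]

x = cg(matvec, [1.0, 2.0])  # solves [[4,1],[1,3]] x = [1,2]
```

Because only the matvec is needed, the matrix for billions of unknowns never has to be stored explicitly; making that product fast is where methods like FLEMS concentrate their effort.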

Under NPACI, FLEMS is being expanded to larger models with hundreds and thousands of fibers. Part of the effort is going toward making FLEMS scale to larger parallel systems--moving from the IBM SP and CRAY T3E at the University of Texas to the large IBM SP computer at SDSC. At the same time, the researchers are improving the numerical methods on two fronts, first by making them more efficient, and second by making them more accurate. The improved accuracy stems from integrating techniques developed at Caltech's Materials and Process Simulation Center (MSC), directed by William Goddard.



The MSC's goal is to develop techniques that look at materials at different scales--from quantum to molecular to mesoscale to continuum levels. "It's only at the continuum level that you're actually looking at a hunk of material," said Richard Muller, director of Quantum Simulations at the MSC. "The collaboration with Rodin's group is expanding our techniques into the mesoscale and continuum scale. This is critical for solving real systems, and it's a very exciting place to be."

Like most materials engineering work, the methods from the Texas group treat the epoxy matrix as a continuous substance that smoothly engulfs the fibers. The methods do not take into account the molecular structure of the matrix or the fibers. For large objects, this is a reasonable assumption. However, because the fibers are so small--only 10 microns thick--the assumption may fail.

The continuum assumption, for example, says that the fibers are always in contact with the matrix--there are no gaps--and the fibers don't slide within the matrix. Reality, however, is another matter. Under pressure or stress, the fibers slip within the matrix and form gaps.

"This is how cracks start," Muller said. "Cracking and slipping are ways to relieve stress. We are constructing a model using quantum and classical mechanics to describe the energy involved in cracking and sliding."

In particular, the Texas-Caltech collaboration is looking at the interphase material--the transition zone between the fiber and the matrix. The interphase, which may be only a few atoms thick, is the starting point of the behavior observed for the material as a whole, and capturing the behavior accurately requires atomistic and quantum simulations.

The MSC is developing the tools to allow Rodin to do quantum simulations and apply them to practical problems of interest to the engineering community. Thus far, MSC researchers have parallelized their codes to run on NPACI's 256-node HP Exemplar at Caltech and the Origin2000 at NCSA. "This phase has progressed far enough that now we are moving on to address the research questions that we're really interested in," Muller said.

Eventually, the continuum simulation will incorporate results from a quantum simulation running in parallel on another platform by transferring data through NPACI's data-intensive computing environment.

Where Damage Starts

A simulation by Vemaganti shows where damage begins, leading to a loss of local stiffness, in a composite with embedded particles. Brighter colors correspond to higher effective stress in the inclusions after damage is initiated.


As the work progresses and efficient parallel methods at both the atomistic and continuum scales run on higher-performance machines, the project will take advantage of other NPACI expertise in the Data-intensive Computing and Interaction Environments thrust areas. In one effort, the simulations will be enhanced by incorporating input from X-ray computed tomography. The Active Data Repository project at the University of Maryland, led by Joel Saltz, is developing just the sort of tools to handle these large tomography data sets. Farther down the road, the simulations will incorporate interaction tools for steering the computations, visualizing the results, and correlating computational and experimental data.

At the moment, however, Rodin is focused on the Caltech collaboration. "Other NPACI collaborations will take time," he said. "They will involve overcoming cultural barriers." Even so, NPACI has already connected two communities that have long looked at the same problem at different levels of detail.

"This project is bridging scales--the interface between atomistic and continuum simulations," Rodin said. "At the same time, we are also bridging a major gap that has traditionally separated the physics and engineering communities. NPACI has constructed that bridge."